Feb 27 16:18:46 localhost kernel: Linux version 5.14.0-686.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Thu Feb 19 10:49:27 UTC 2026
Feb 27 16:18:46 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 27 16:18:46 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 27 16:18:46 localhost kernel: BIOS-provided physical RAM map:
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 27 16:18:46 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Feb 27 16:18:46 localhost kernel: NX (Execute Disable) protection: active
Feb 27 16:18:46 localhost kernel: APIC: Static calls initialized
Feb 27 16:18:46 localhost kernel: SMBIOS 2.8 present.
Feb 27 16:18:46 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 27 16:18:46 localhost kernel: Hypervisor detected: KVM
Feb 27 16:18:46 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 27 16:18:46 localhost kernel: kvm-clock: using sched offset of 10567239079 cycles
Feb 27 16:18:46 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 27 16:18:46 localhost kernel: tsc: Detected 2800.000 MHz processor
Feb 27 16:18:46 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 27 16:18:46 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 27 16:18:46 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Feb 27 16:18:46 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Feb 27 16:18:46 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 27 16:18:46 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 27 16:18:46 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 27 16:18:46 localhost kernel: Using GB pages for direct mapping
Feb 27 16:18:46 localhost kernel: RAMDISK: [mem 0x1b6ca000-0x29b5cfff]
Feb 27 16:18:46 localhost kernel: ACPI: Early table checksum verification disabled
Feb 27 16:18:46 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 27 16:18:46 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 27 16:18:46 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 27 16:18:46 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 27 16:18:46 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 27 16:18:46 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 27 16:18:46 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 27 16:18:46 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 27 16:18:46 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 27 16:18:46 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 27 16:18:46 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 27 16:18:46 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 27 16:18:46 localhost kernel: No NUMA configuration found
Feb 27 16:18:46 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Feb 27 16:18:46 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Feb 27 16:18:46 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Feb 27 16:18:46 localhost kernel: Zone ranges:
Feb 27 16:18:46 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 27 16:18:46 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 27 16:18:46 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Feb 27 16:18:46 localhost kernel:   Device   empty
Feb 27 16:18:46 localhost kernel: Movable zone start for each node
Feb 27 16:18:46 localhost kernel: Early memory node ranges
Feb 27 16:18:46 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 27 16:18:46 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 27 16:18:46 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Feb 27 16:18:46 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Feb 27 16:18:46 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 27 16:18:46 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 27 16:18:46 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 27 16:18:46 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 27 16:18:46 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 27 16:18:46 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 27 16:18:46 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 27 16:18:46 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 27 16:18:46 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 27 16:18:46 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 27 16:18:46 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 27 16:18:46 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 27 16:18:46 localhost kernel: TSC deadline timer available
Feb 27 16:18:46 localhost kernel: CPU topo: Max. logical packages:   8
Feb 27 16:18:46 localhost kernel: CPU topo: Max. logical dies:       8
Feb 27 16:18:46 localhost kernel: CPU topo: Max. dies per package:   1
Feb 27 16:18:46 localhost kernel: CPU topo: Max. threads per core:   1
Feb 27 16:18:46 localhost kernel: CPU topo: Num. cores per package:     1
Feb 27 16:18:46 localhost kernel: CPU topo: Num. threads per package:   1
Feb 27 16:18:46 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Feb 27 16:18:46 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 27 16:18:46 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 27 16:18:46 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 27 16:18:46 localhost kernel: Booting paravirtualized kernel on KVM
Feb 27 16:18:46 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 27 16:18:46 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 27 16:18:46 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Feb 27 16:18:46 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Feb 27 16:18:46 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 27 16:18:46 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 27 16:18:46 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 27 16:18:46 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64", will be passed to user space.
Feb 27 16:18:46 localhost kernel: random: crng init done
Feb 27 16:18:46 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 27 16:18:46 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Feb 27 16:18:46 localhost kernel: Fallback order for Node 0: 0 
Feb 27 16:18:46 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Feb 27 16:18:46 localhost kernel: Policy zone: Normal
Feb 27 16:18:46 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 27 16:18:46 localhost kernel: software IO TLB: area num 8.
Feb 27 16:18:46 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 27 16:18:46 localhost kernel: ftrace: allocating 49605 entries in 194 pages
Feb 27 16:18:46 localhost kernel: ftrace: allocated 194 pages with 3 groups
Feb 27 16:18:46 localhost kernel: Dynamic Preempt: voluntary
Feb 27 16:18:46 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 27 16:18:46 localhost kernel: rcu:         RCU event tracing is enabled.
Feb 27 16:18:46 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 27 16:18:46 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 27 16:18:46 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 27 16:18:46 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 27 16:18:46 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 27 16:18:46 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 27 16:18:46 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 27 16:18:46 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 27 16:18:46 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Feb 27 16:18:46 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 27 16:18:46 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 27 16:18:46 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 27 16:18:46 localhost kernel: Console: colour VGA+ 80x25
Feb 27 16:18:46 localhost kernel: printk: console [ttyS0] enabled
Feb 27 16:18:46 localhost kernel: ACPI: Core revision 20230331
Feb 27 16:18:46 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 27 16:18:46 localhost kernel: x2apic enabled
Feb 27 16:18:46 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Feb 27 16:18:46 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 27 16:18:46 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Feb 27 16:18:46 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 27 16:18:46 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 27 16:18:46 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 27 16:18:46 localhost kernel: mitigations: Enabled attack vectors: user_kernel, user_user, guest_host, guest_guest, SMT mitigations: auto
Feb 27 16:18:46 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 27 16:18:46 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 27 16:18:46 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 27 16:18:46 localhost kernel: Speculative Return Stack Overflow: Mitigation: SMT disabled
Feb 27 16:18:46 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 27 16:18:46 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Feb 27 16:18:46 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 27 16:18:46 localhost kernel: active return thunk: retbleed_return_thunk
Feb 27 16:18:46 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 27 16:18:46 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 27 16:18:46 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 27 16:18:46 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 27 16:18:46 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 27 16:18:46 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 27 16:18:46 localhost kernel: Freeing SMP alternatives memory: 40K
Feb 27 16:18:46 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 27 16:18:46 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Feb 27 16:18:46 localhost kernel: landlock: Up and running.
Feb 27 16:18:46 localhost kernel: Yama: becoming mindful.
Feb 27 16:18:46 localhost kernel: SELinux:  Initializing.
Feb 27 16:18:46 localhost kernel: LSM support for eBPF active
Feb 27 16:18:46 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 27 16:18:46 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 27 16:18:46 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 27 16:18:46 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 27 16:18:46 localhost kernel: ... version:                0
Feb 27 16:18:46 localhost kernel: ... bit width:              48
Feb 27 16:18:46 localhost kernel: ... generic registers:      6
Feb 27 16:18:46 localhost kernel: ... value mask:             0000ffffffffffff
Feb 27 16:18:46 localhost kernel: ... max period:             00007fffffffffff
Feb 27 16:18:46 localhost kernel: ... fixed-purpose events:   0
Feb 27 16:18:46 localhost kernel: ... event mask:             000000000000003f
Feb 27 16:18:46 localhost kernel: signal: max sigframe size: 1776
Feb 27 16:18:46 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 27 16:18:46 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 27 16:18:46 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 27 16:18:46 localhost kernel: smpboot: x86: Booting SMP configuration:
Feb 27 16:18:46 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 27 16:18:46 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 27 16:18:46 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Feb 27 16:18:46 localhost kernel: node 0 deferred pages initialised in 11ms
Feb 27 16:18:46 localhost kernel: Memory: 7617716K/8388068K available (16384K kernel code, 5797K rwdata, 13956K rodata, 4204K init, 7172K bss, 764464K reserved, 0K cma-reserved)
Feb 27 16:18:46 localhost kernel: devtmpfs: initialized
Feb 27 16:18:46 localhost kernel: x86/mm: Memory block size: 128MB
Feb 27 16:18:46 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 27 16:18:46 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Feb 27 16:18:46 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 27 16:18:46 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 27 16:18:46 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Feb 27 16:18:46 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 27 16:18:46 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 27 16:18:46 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 27 16:18:46 localhost kernel: audit: type=2000 audit(1772209124.679:1): state=initialized audit_enabled=0 res=1
Feb 27 16:18:46 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 27 16:18:46 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 27 16:18:46 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 27 16:18:46 localhost kernel: cpuidle: using governor menu
Feb 27 16:18:46 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 27 16:18:46 localhost kernel: PCI: Using configuration type 1 for base access
Feb 27 16:18:46 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 27 16:18:46 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 27 16:18:46 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 27 16:18:46 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 27 16:18:46 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 27 16:18:46 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 27 16:18:46 localhost kernel: Demotion targets for Node 0: null
Feb 27 16:18:46 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 27 16:18:46 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 27 16:18:46 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 27 16:18:46 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 27 16:18:46 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 27 16:18:46 localhost kernel: ACPI: Interpreter enabled
Feb 27 16:18:46 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 27 16:18:46 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 27 16:18:46 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 27 16:18:46 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 27 16:18:46 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 27 16:18:46 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 27 16:18:46 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [3] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [4] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [5] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [6] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [7] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [8] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [9] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [10] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [11] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [12] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [13] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [14] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [15] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [16] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [17] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [18] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [19] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [20] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [21] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [22] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [23] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [24] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [25] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [26] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [27] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [28] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [29] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [30] registered
Feb 27 16:18:46 localhost kernel: acpiphp: Slot [31] registered
Feb 27 16:18:46 localhost kernel: PCI host bridge to bus 0000:00
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Feb 27 16:18:46 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Feb 27 16:18:46 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 27 16:18:46 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Feb 27 16:18:46 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Feb 27 16:18:46 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 27 16:18:46 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 27 16:18:46 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 27 16:18:46 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 27 16:18:46 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 27 16:18:46 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 27 16:18:46 localhost kernel: iommu: Default domain type: Translated
Feb 27 16:18:46 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 27 16:18:46 localhost kernel: SCSI subsystem initialized
Feb 27 16:18:46 localhost kernel: ACPI: bus type USB registered
Feb 27 16:18:46 localhost kernel: usbcore: registered new interface driver usbfs
Feb 27 16:18:46 localhost kernel: usbcore: registered new interface driver hub
Feb 27 16:18:46 localhost kernel: usbcore: registered new device driver usb
Feb 27 16:18:46 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 27 16:18:46 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 27 16:18:46 localhost kernel: PTP clock support registered
Feb 27 16:18:46 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 27 16:18:46 localhost kernel: NetLabel: Initializing
Feb 27 16:18:46 localhost kernel: NetLabel:  domain hash size = 128
Feb 27 16:18:46 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 27 16:18:46 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 27 16:18:46 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 27 16:18:46 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 27 16:18:46 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 27 16:18:46 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 27 16:18:46 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 27 16:18:46 localhost kernel: vgaarb: loaded
Feb 27 16:18:46 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 27 16:18:46 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 27 16:18:46 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 27 16:18:46 localhost kernel: pnp: PnP ACPI init
Feb 27 16:18:46 localhost kernel: pnp 00:03: [dma 2]
Feb 27 16:18:46 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 27 16:18:46 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 27 16:18:46 localhost kernel: NET: Registered PF_INET protocol family
Feb 27 16:18:46 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 27 16:18:46 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Feb 27 16:18:46 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 27 16:18:46 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Feb 27 16:18:46 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 27 16:18:46 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Feb 27 16:18:46 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Feb 27 16:18:46 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 27 16:18:46 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Feb 27 16:18:46 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 27 16:18:46 localhost kernel: NET: Registered PF_XDP protocol family
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 27 16:18:46 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 27 16:18:46 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 27 16:18:46 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 27 16:18:46 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 25913 usecs
Feb 27 16:18:46 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 27 16:18:46 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 27 16:18:46 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 27 16:18:46 localhost kernel: ACPI: bus type thunderbolt registered
Feb 27 16:18:46 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 27 16:18:46 localhost kernel: Initialise system trusted keyrings
Feb 27 16:18:46 localhost kernel: Key type blacklist registered
Feb 27 16:18:46 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Feb 27 16:18:46 localhost kernel: zbud: loaded
Feb 27 16:18:46 localhost kernel: integrity: Platform Keyring initialized
Feb 27 16:18:46 localhost kernel: integrity: Machine keyring initialized
Feb 27 16:18:46 localhost kernel: Freeing initrd memory: 234060K
Feb 27 16:18:46 localhost kernel: NET: Registered PF_ALG protocol family
Feb 27 16:18:46 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 27 16:18:46 localhost kernel: Key type asymmetric registered
Feb 27 16:18:46 localhost kernel: Asymmetric key parser 'x509' registered
Feb 27 16:18:46 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 27 16:18:46 localhost kernel: io scheduler mq-deadline registered
Feb 27 16:18:46 localhost kernel: io scheduler kyber registered
Feb 27 16:18:46 localhost kernel: io scheduler bfq registered
Feb 27 16:18:46 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 27 16:18:46 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 27 16:18:46 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 27 16:18:46 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 27 16:18:46 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 27 16:18:46 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 27 16:18:46 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 27 16:18:46 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 27 16:18:46 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 27 16:18:46 localhost kernel: Non-volatile memory driver v1.3
Feb 27 16:18:46 localhost kernel: rdac: device handler registered
Feb 27 16:18:46 localhost kernel: hp_sw: device handler registered
Feb 27 16:18:46 localhost kernel: emc: device handler registered
Feb 27 16:18:46 localhost kernel: alua: device handler registered
Feb 27 16:18:46 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 27 16:18:46 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 27 16:18:46 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 27 16:18:46 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 27 16:18:46 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 27 16:18:46 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 27 16:18:46 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 27 16:18:46 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-686.el9.x86_64 uhci_hcd
Feb 27 16:18:46 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 27 16:18:46 localhost kernel: hub 1-0:1.0: USB hub found
Feb 27 16:18:46 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 27 16:18:46 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 27 16:18:46 localhost kernel: usbserial: USB Serial support registered for generic
Feb 27 16:18:46 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 27 16:18:46 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 27 16:18:46 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 27 16:18:46 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 27 16:18:46 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 27 16:18:46 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 27 16:18:46 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 27 16:18:46 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-27T16:18:45 UTC (1772209125)
Feb 27 16:18:46 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 27 16:18:46 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Feb 27 16:18:46 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 27 16:18:46 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 27 16:18:46 localhost kernel: usbcore: registered new interface driver usbhid
Feb 27 16:18:46 localhost kernel: usbhid: USB HID core driver
Feb 27 16:18:46 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 27 16:18:46 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 27 16:18:46 localhost kernel: Initializing XFRM netlink socket
Feb 27 16:18:46 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 27 16:18:46 localhost kernel: Segment Routing with IPv6
Feb 27 16:18:46 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 27 16:18:46 localhost kernel: mpls_gso: MPLS GSO support
Feb 27 16:18:46 localhost kernel: IPI shorthand broadcast: enabled
Feb 27 16:18:46 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 27 16:18:46 localhost kernel: AES CTR mode by8 optimization enabled
Feb 27 16:18:46 localhost kernel: sched_clock: Marking stable (1090047430, 140647640)->(1354108889, -123413819)
Feb 27 16:18:46 localhost kernel: registered taskstats version 1
Feb 27 16:18:46 localhost kernel: Loading compiled-in X.509 certificates
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: d9d4cefd3ca2c4957ef0b2e7c6e39a7e4ae16390'
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Feb 27 16:18:46 localhost kernel: Demotion targets for Node 0: null
Feb 27 16:18:46 localhost kernel: page_owner is disabled
Feb 27 16:18:46 localhost kernel: Key type .fscrypt registered
Feb 27 16:18:46 localhost kernel: Key type fscrypt-provisioning registered
Feb 27 16:18:46 localhost kernel: Key type big_key registered
Feb 27 16:18:46 localhost kernel: Key type encrypted registered
Feb 27 16:18:46 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 27 16:18:46 localhost kernel: Loading compiled-in module X.509 certificates
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: d9d4cefd3ca2c4957ef0b2e7c6e39a7e4ae16390'
Feb 27 16:18:46 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 27 16:18:46 localhost kernel: ima: No architecture policies found
Feb 27 16:18:46 localhost kernel: evm: Initialising EVM extended attributes:
Feb 27 16:18:46 localhost kernel: evm: security.selinux
Feb 27 16:18:46 localhost kernel: evm: security.SMACK64 (disabled)
Feb 27 16:18:46 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 27 16:18:46 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 27 16:18:46 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 27 16:18:46 localhost kernel: evm: security.apparmor (disabled)
Feb 27 16:18:46 localhost kernel: evm: security.ima
Feb 27 16:18:46 localhost kernel: evm: security.capability
Feb 27 16:18:46 localhost kernel: evm: HMAC attrs: 0x1
Feb 27 16:18:46 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 27 16:18:46 localhost kernel: Running certificate verification RSA selftest
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 27 16:18:46 localhost kernel: Running certificate verification ECDSA selftest
Feb 27 16:18:46 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Feb 27 16:18:46 localhost kernel: clk: Disabling unused clocks
Feb 27 16:18:46 localhost kernel: Freeing unused decrypted memory: 2028K
Feb 27 16:18:46 localhost kernel: Freeing unused kernel image (initmem) memory: 4204K
Feb 27 16:18:46 localhost kernel: Write protecting the kernel read-only data: 30720k
Feb 27 16:18:46 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 380K
Feb 27 16:18:46 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 27 16:18:46 localhost kernel: Run /init as init process
Feb 27 16:18:46 localhost kernel:   with arguments:
Feb 27 16:18:46 localhost kernel:     /init
Feb 27 16:18:46 localhost kernel:   with environment:
Feb 27 16:18:46 localhost kernel:     HOME=/
Feb 27 16:18:46 localhost kernel:     TERM=linux
Feb 27 16:18:46 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64
Feb 27 16:18:46 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 27 16:18:46 localhost systemd[1]: Detected virtualization kvm.
Feb 27 16:18:46 localhost systemd[1]: Detected architecture x86-64.
Feb 27 16:18:46 localhost systemd[1]: Running in initrd.
Feb 27 16:18:46 localhost systemd[1]: No hostname configured, using default hostname.
Feb 27 16:18:46 localhost systemd[1]: Hostname set to <localhost>.
Feb 27 16:18:46 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 27 16:18:46 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 27 16:18:46 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 27 16:18:46 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 27 16:18:46 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 27 16:18:46 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 27 16:18:46 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 27 16:18:46 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 27 16:18:46 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 27 16:18:46 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 27 16:18:46 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 27 16:18:46 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 27 16:18:46 localhost systemd[1]: Reached target Local File Systems.
Feb 27 16:18:46 localhost systemd[1]: Reached target Path Units.
Feb 27 16:18:46 localhost systemd[1]: Reached target Slice Units.
Feb 27 16:18:46 localhost systemd[1]: Reached target Swaps.
Feb 27 16:18:46 localhost systemd[1]: Reached target Timer Units.
Feb 27 16:18:46 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 27 16:18:46 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 27 16:18:46 localhost systemd[1]: Listening on Journal Socket.
Feb 27 16:18:46 localhost systemd[1]: Listening on udev Control Socket.
Feb 27 16:18:46 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 27 16:18:46 localhost systemd[1]: Reached target Socket Units.
Feb 27 16:18:46 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 27 16:18:46 localhost systemd[1]: Starting Journal Service...
Feb 27 16:18:46 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 27 16:18:46 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 27 16:18:46 localhost systemd[1]: Starting Create System Users...
Feb 27 16:18:46 localhost systemd[1]: Starting Setup Virtual Console...
Feb 27 16:18:46 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 27 16:18:46 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 27 16:18:46 localhost systemd[1]: Finished Create System Users.
Feb 27 16:18:46 localhost systemd-journald[305]: Journal started
Feb 27 16:18:46 localhost systemd-journald[305]: Runtime Journal (/run/log/journal/1b296e3637ac4d9b9bcde79d835197c3) is 8.0M, max 153.6M, 145.6M free.
Feb 27 16:18:46 localhost systemd-sysusers[309]: Creating group 'users' with GID 100.
Feb 27 16:18:46 localhost systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Feb 27 16:18:46 localhost systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 27 16:18:46 localhost systemd[1]: Started Journal Service.
Feb 27 16:18:46 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 27 16:18:46 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 27 16:18:46 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 27 16:18:46 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 27 16:18:46 localhost systemd[1]: Finished Setup Virtual Console.
Feb 27 16:18:46 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 27 16:18:46 localhost systemd[1]: Starting dracut cmdline hook...
Feb 27 16:18:46 localhost dracut-cmdline[324]: dracut-9 dracut-057-110.git20260130.el9
Feb 27 16:18:46 localhost dracut-cmdline[324]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-686.el9.x86_64 root=UUID=37391a25-080d-4723-8b0c-cb88a559875b ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Feb 27 16:18:46 localhost systemd[1]: Finished dracut cmdline hook.
Feb 27 16:18:46 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 27 16:18:46 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 27 16:18:46 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 27 16:18:46 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Feb 27 16:18:46 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 27 16:18:46 localhost kernel: RPC: Registered udp transport module.
Feb 27 16:18:46 localhost kernel: RPC: Registered tcp transport module.
Feb 27 16:18:46 localhost kernel: RPC: Registered tcp-with-tls transport module.
Feb 27 16:18:46 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 27 16:18:46 localhost rpc.statd[441]: Version 2.5.4 starting
Feb 27 16:18:46 localhost rpc.statd[441]: Initializing NSM state
Feb 27 16:18:46 localhost rpc.idmapd[446]: Setting log level to 0
Feb 27 16:18:46 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 27 16:18:46 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 27 16:18:46 localhost systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Feb 27 16:18:46 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 27 16:18:46 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 27 16:18:46 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 27 16:18:46 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 27 16:18:46 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 27 16:18:46 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 27 16:18:46 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 27 16:18:46 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 27 16:18:46 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 27 16:18:46 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 27 16:18:46 localhost systemd[1]: Reached target Network.
Feb 27 16:18:47 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Feb 27 16:18:47 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 27 16:18:47 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Feb 27 16:18:47 localhost kernel:  vda: vda1
Feb 27 16:18:47 localhost systemd[1]: Starting dracut initqueue hook...
Feb 27 16:18:47 localhost kernel: libata version 3.00 loaded.
Feb 27 16:18:47 localhost kernel: ACPI: bus type drm_connector registered
Feb 27 16:18:47 localhost systemd[1]: Found device /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b.
Feb 27 16:18:47 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 27 16:18:47 localhost kernel: scsi host0: ata_piix
Feb 27 16:18:47 localhost kernel: scsi host1: ata_piix
Feb 27 16:18:47 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Feb 27 16:18:47 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Feb 27 16:18:47 localhost systemd[1]: Reached target Initrd Root Device.
Feb 27 16:18:47 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 27 16:18:47 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 27 16:18:47 localhost systemd[1]: Reached target System Initialization.
Feb 27 16:18:47 localhost systemd[1]: Reached target Basic System.
Feb 27 16:18:47 localhost kernel: ata1: found unknown device (class 0)
Feb 27 16:18:47 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 27 16:18:47 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 27 16:18:47 localhost systemd-udevd[461]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:18:47 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 27 16:18:47 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 27 16:18:47 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 27 16:18:47 localhost kernel: Console: switching to colour dummy device 80x25
Feb 27 16:18:47 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 27 16:18:47 localhost kernel: [drm] features: -context_init
Feb 27 16:18:47 localhost kernel: [drm] number of scanouts: 1
Feb 27 16:18:47 localhost kernel: [drm] number of cap sets: 0
Feb 27 16:18:47 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Feb 27 16:18:47 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Feb 27 16:18:47 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 27 16:18:47 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 27 16:18:47 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 27 16:18:47 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 27 16:18:47 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 27 16:18:47 localhost systemd[1]: Finished dracut initqueue hook.
Feb 27 16:18:47 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 27 16:18:47 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 27 16:18:47 localhost systemd[1]: Reached target Remote File Systems.
Feb 27 16:18:47 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 27 16:18:47 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 27 16:18:47 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b...
Feb 27 16:18:47 localhost systemd-fsck[564]: /usr/sbin/fsck.xfs: XFS file system.
Feb 27 16:18:47 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b.
Feb 27 16:18:47 localhost systemd[1]: Mounting /sysroot...
Feb 27 16:18:48 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 27 16:18:48 localhost kernel: XFS (vda1): Mounting V5 Filesystem 37391a25-080d-4723-8b0c-cb88a559875b
Feb 27 16:18:48 localhost kernel: XFS (vda1): Ending clean mount
Feb 27 16:18:48 localhost systemd[1]: Mounted /sysroot.
Feb 27 16:18:48 localhost systemd[1]: Reached target Initrd Root File System.
Feb 27 16:18:48 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 27 16:18:48 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 27 16:18:48 localhost systemd[1]: Reached target Initrd File Systems.
Feb 27 16:18:48 localhost systemd[1]: Reached target Initrd Default Target.
Feb 27 16:18:48 localhost systemd[1]: Starting dracut mount hook...
Feb 27 16:18:48 localhost systemd[1]: Finished dracut mount hook.
Feb 27 16:18:48 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 27 16:18:48 localhost rpc.idmapd[446]: exiting on signal 15
Feb 27 16:18:48 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 27 16:18:48 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 27 16:18:48 localhost systemd[1]: Stopped target Network.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Timer Units.
Feb 27 16:18:48 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 27 16:18:48 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Basic System.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Path Units.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Remote File Systems.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Slice Units.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Socket Units.
Feb 27 16:18:48 localhost systemd[1]: Stopped target System Initialization.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Local File Systems.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Swaps.
Feb 27 16:18:48 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut mount hook.
Feb 27 16:18:48 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 27 16:18:48 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 27 16:18:48 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 27 16:18:48 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 27 16:18:48 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 27 16:18:48 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 27 16:18:48 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 27 16:18:48 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 27 16:18:48 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 27 16:18:48 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 27 16:18:48 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 27 16:18:48 localhost systemd[1]: systemd-udevd.service: Consumed 1.004s CPU time.
Feb 27 16:18:48 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 27 16:18:48 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Closed udev Control Socket.
Feb 27 16:18:48 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Closed udev Kernel Socket.
Feb 27 16:18:48 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 27 16:18:48 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 27 16:18:48 localhost systemd[1]: Starting Cleanup udev Database...
Feb 27 16:18:48 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 27 16:18:48 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 27 16:18:48 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Stopped Create System Users.
Feb 27 16:18:48 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 27 16:18:48 localhost systemd[1]: Finished Cleanup udev Database.
Feb 27 16:18:48 localhost systemd[1]: Reached target Switch Root.
Feb 27 16:18:48 localhost systemd[1]: Starting Switch Root...
Feb 27 16:18:48 localhost systemd[1]: Switching root.
Feb 27 16:18:48 localhost systemd-journald[305]: Journal stopped
Feb 27 16:18:49 localhost systemd-journald[305]: Received SIGTERM from PID 1 (systemd).
Feb 27 16:18:49 localhost kernel: audit: type=1404 audit(1772209128.630:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability open_perms=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:18:49 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:18:49 localhost kernel: audit: type=1403 audit(1772209128.758:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 27 16:18:49 localhost systemd[1]: Successfully loaded SELinux policy in 130.601ms.
Feb 27 16:18:49 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 41.384ms.
Feb 27 16:18:49 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 27 16:18:49 localhost systemd[1]: Detected virtualization kvm.
Feb 27 16:18:49 localhost systemd[1]: Detected architecture x86-64.
Feb 27 16:18:49 localhost systemd-rc-local-generator[645]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:18:49 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Stopped Switch Root.
Feb 27 16:18:49 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 27 16:18:49 localhost systemd[1]: Created slice Slice /system/getty.
Feb 27 16:18:49 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 27 16:18:49 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 27 16:18:49 localhost systemd[1]: Created slice User and Session Slice.
Feb 27 16:18:49 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 27 16:18:49 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 27 16:18:49 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 27 16:18:49 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 27 16:18:49 localhost systemd[1]: Stopped target Switch Root.
Feb 27 16:18:49 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 27 16:18:49 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 27 16:18:49 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 27 16:18:49 localhost systemd[1]: Reached target Path Units.
Feb 27 16:18:49 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 27 16:18:49 localhost systemd[1]: Reached target Slice Units.
Feb 27 16:18:49 localhost systemd[1]: Reached target Swaps.
Feb 27 16:18:49 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 27 16:18:49 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 27 16:18:49 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 27 16:18:49 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 27 16:18:49 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 27 16:18:49 localhost systemd[1]: Listening on udev Control Socket.
Feb 27 16:18:49 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 27 16:18:49 localhost systemd[1]: Mounting Huge Pages File System...
Feb 27 16:18:49 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 27 16:18:49 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 27 16:18:49 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 27 16:18:49 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 27 16:18:49 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 27 16:18:49 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 27 16:18:49 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 27 16:18:49 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Feb 27 16:18:49 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 27 16:18:49 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 27 16:18:49 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 27 16:18:49 localhost systemd[1]: Stopped Journal Service.
Feb 27 16:18:49 localhost systemd[1]: Starting Journal Service...
Feb 27 16:18:49 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Feb 27 16:18:49 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 27 16:18:49 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 27 16:18:49 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 27 16:18:49 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 27 16:18:49 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 27 16:18:49 localhost kernel: fuse: init (API version 7.37)
Feb 27 16:18:49 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 27 16:18:49 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 27 16:18:49 localhost systemd-journald[693]: Journal started
Feb 27 16:18:49 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/45af4031c1bdc072f1f045c25038675f) is 8.0M, max 153.6M, 145.6M free.
Feb 27 16:18:49 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 27 16:18:49 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Started Journal Service.
Feb 27 16:18:49 localhost systemd[1]: Mounted Huge Pages File System.
Feb 27 16:18:49 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 27 16:18:49 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 27 16:18:49 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 27 16:18:49 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 27 16:18:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 27 16:18:49 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 27 16:18:49 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Feb 27 16:18:49 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 27 16:18:49 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 27 16:18:49 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 27 16:18:49 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 27 16:18:49 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 27 16:18:49 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 27 16:18:49 localhost systemd[1]: Mounting FUSE Control File System...
Feb 27 16:18:49 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 27 16:18:49 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 27 16:18:49 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 27 16:18:49 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 27 16:18:49 localhost systemd[1]: Starting Load/Save OS Random Seed...
Feb 27 16:18:49 localhost systemd[1]: Starting Create System Users...
Feb 27 16:18:49 localhost systemd[1]: Mounted FUSE Control File System.
Feb 27 16:18:49 localhost systemd-journald[693]: Runtime Journal (/run/log/journal/45af4031c1bdc072f1f045c25038675f) is 8.0M, max 153.6M, 145.6M free.
Feb 27 16:18:49 localhost systemd-journald[693]: Received client request to flush runtime journal.
Feb 27 16:18:49 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 27 16:18:49 localhost systemd[1]: Finished Load/Save OS Random Seed.
Feb 27 16:18:49 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 27 16:18:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 27 16:18:49 localhost systemd[1]: Finished Create System Users.
Feb 27 16:18:49 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 27 16:18:49 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 27 16:18:49 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 27 16:18:49 localhost systemd[1]: Reached target Local File Systems.
Feb 27 16:18:49 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 27 16:18:49 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 27 16:18:49 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 27 16:18:49 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Feb 27 16:18:49 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 27 16:18:49 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 27 16:18:49 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 27 16:18:49 localhost bootctl[710]: Couldn't find EFI system partition, skipping.
Feb 27 16:18:49 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 27 16:18:49 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 27 16:18:49 localhost systemd[1]: Starting Security Auditing Service...
Feb 27 16:18:49 localhost systemd[1]: Starting RPC Bind...
Feb 27 16:18:49 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 27 16:18:49 localhost auditd[716]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Feb 27 16:18:49 localhost auditd[716]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Feb 27 16:18:49 localhost systemd[1]: Started RPC Bind.
Feb 27 16:18:49 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 27 16:18:49 localhost augenrules[721]: /sbin/augenrules: No change
Feb 27 16:18:49 localhost augenrules[736]: No rules
Feb 27 16:18:49 localhost augenrules[736]: enabled 1
Feb 27 16:18:49 localhost augenrules[736]: failure 1
Feb 27 16:18:49 localhost augenrules[736]: pid 716
Feb 27 16:18:49 localhost augenrules[736]: rate_limit 0
Feb 27 16:18:49 localhost augenrules[736]: backlog_limit 8192
Feb 27 16:18:49 localhost augenrules[736]: lost 0
Feb 27 16:18:49 localhost augenrules[736]: backlog 3
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time 60000
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time_actual 0
Feb 27 16:18:49 localhost augenrules[736]: enabled 1
Feb 27 16:18:49 localhost augenrules[736]: failure 1
Feb 27 16:18:49 localhost augenrules[736]: pid 716
Feb 27 16:18:49 localhost augenrules[736]: rate_limit 0
Feb 27 16:18:49 localhost augenrules[736]: backlog_limit 8192
Feb 27 16:18:49 localhost augenrules[736]: lost 0
Feb 27 16:18:49 localhost augenrules[736]: backlog 3
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time 60000
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time_actual 0
Feb 27 16:18:49 localhost augenrules[736]: enabled 1
Feb 27 16:18:49 localhost augenrules[736]: failure 1
Feb 27 16:18:49 localhost augenrules[736]: pid 716
Feb 27 16:18:49 localhost augenrules[736]: rate_limit 0
Feb 27 16:18:49 localhost augenrules[736]: backlog_limit 8192
Feb 27 16:18:49 localhost augenrules[736]: lost 0
Feb 27 16:18:49 localhost augenrules[736]: backlog 4
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time 60000
Feb 27 16:18:49 localhost augenrules[736]: backlog_wait_time_actual 0
Feb 27 16:18:49 localhost systemd[1]: Started Security Auditing Service.
Feb 27 16:18:49 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 27 16:18:49 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 27 16:18:50 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 27 16:18:50 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 27 16:18:50 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 27 16:18:50 localhost systemd[1]: Starting Update is Completed...
Feb 27 16:18:50 localhost systemd[1]: Finished Update is Completed.
Feb 27 16:18:50 localhost systemd-udevd[744]: Using default interface naming scheme 'rhel-9.0'.
Feb 27 16:18:50 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 27 16:18:50 localhost systemd[1]: Reached target System Initialization.
Feb 27 16:18:50 localhost systemd[1]: Started dnf makecache --timer.
Feb 27 16:18:50 localhost systemd[1]: Started Daily rotation of log files.
Feb 27 16:18:50 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 27 16:18:50 localhost systemd[1]: Reached target Timer Units.
Feb 27 16:18:50 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 27 16:18:50 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 27 16:18:50 localhost systemd[1]: Reached target Socket Units.
Feb 27 16:18:50 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 27 16:18:50 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 27 16:18:50 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 27 16:18:50 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 27 16:18:50 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 27 16:18:50 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 27 16:18:50 localhost systemd-udevd[755]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:18:50 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 27 16:18:50 localhost systemd[1]: Reached target Basic System.
Feb 27 16:18:50 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 27 16:18:50 localhost dbus-broker-lau[779]: Ready
Feb 27 16:18:50 localhost systemd[1]: Starting NTP client/server...
Feb 27 16:18:50 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 27 16:18:50 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Feb 27 16:18:50 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Feb 27 16:18:50 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Feb 27 16:18:50 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 27 16:18:50 localhost systemd[1]: Starting IPv4 firewall with iptables...
Feb 27 16:18:50 localhost systemd[1]: Started irqbalance daemon.
Feb 27 16:18:50 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 27 16:18:50 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 16:18:50 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 16:18:50 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 16:18:50 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 27 16:18:50 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 27 16:18:50 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 27 16:18:50 localhost systemd[1]: Starting User Login Management...
Feb 27 16:18:50 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 27 16:18:50 localhost systemd-logind[803]: New seat seat0.
Feb 27 16:18:50 localhost systemd-logind[803]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 27 16:18:50 localhost systemd-logind[803]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 27 16:18:50 localhost systemd[1]: Started User Login Management.
Feb 27 16:18:50 localhost chronyd[817]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 27 16:18:50 localhost chronyd[817]: Loaded 0 symmetric keys
Feb 27 16:18:50 localhost chronyd[817]: Using right/UTC timezone to obtain leap second data
Feb 27 16:18:50 localhost chronyd[817]: Loaded seccomp filter (level 2)
Feb 27 16:18:50 localhost systemd[1]: Started NTP client/server.
Feb 27 16:18:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 27 16:18:50 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Feb 27 16:18:50 localhost kernel: kvm_amd: TSC scaling supported
Feb 27 16:18:50 localhost kernel: kvm_amd: Nested Virtualization enabled
Feb 27 16:18:50 localhost kernel: kvm_amd: Nested Paging enabled
Feb 27 16:18:50 localhost kernel: kvm_amd: LBR virtualization supported
Feb 27 16:18:50 localhost iptables.init[795]: iptables: Applying firewall rules: [  OK  ]
Feb 27 16:18:50 localhost systemd[1]: Finished IPv4 firewall with iptables.
Feb 27 16:18:50 localhost cloud-init[847]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 27 Feb 2026 16:18:50 +0000. Up 6.49 seconds.
Feb 27 16:18:51 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 27 16:18:51 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 27 16:18:51 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpct3koits.mount: Deactivated successfully.
Feb 27 16:18:51 localhost systemd[1]: Starting Hostname Service...
Feb 27 16:18:51 localhost systemd[1]: Started Hostname Service.
Feb 27 16:18:51 np0005633116.novalocal systemd-hostnamed[861]: Hostname set to <np0005633116.novalocal> (static)
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Reached target Preparation for Network.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Starting Network Manager...
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5336] NetworkManager (version 1.54.3-2.el9) is starting... (boot:7134621a-8b85-4cf7-b630-8b1bd86c0689)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5342] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5502] manager[0x55bb099fb000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5559] hostname: hostname: using hostnamed
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5560] hostname: static hostname changed from (none) to "np0005633116.novalocal"
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5565] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5660] manager[0x55bb099fb000]: rfkill: Wi-Fi hardware radio set enabled
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5661] manager[0x55bb099fb000]: rfkill: WWAN hardware radio set enabled
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5774] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5774] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5775] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5776] manager: Networking is enabled by state file
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5778] settings: Loaded settings plugin: keyfile (internal)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5812] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5837] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5854] dhcp: init: Using DHCP client 'internal'
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5857] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5868] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5894] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5905] device (lo): Activation: starting connection 'lo' (64e865f2-9b77-47c6-8998-ed75b7b9b4c4)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5911] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5915] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5969] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5973] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5976] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Started Network Manager.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5977] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5981] device (eth0): carrier: link connected
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5984] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5987] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5994] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5998] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.5999] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Reached target Network.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6001] manager: NetworkManager state is now CONNECTING
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6003] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6008] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6010] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6058] dhcp4 (eth0): state changed new lease, address=38.129.56.53
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6064] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6080] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6156] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6159] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6162] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6168] device (lo): Activation: successful, device activated.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6174] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6176] manager: NetworkManager state is now CONNECTED_SITE
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6181] device (eth0): Activation: successful, device activated.
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6188] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 27 16:18:51 np0005633116.novalocal NetworkManager[865]: <info>  [1772209131.6190] manager: startup complete
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Reached target NFS client services.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Reached target Remote File Systems.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 27 16:18:51 np0005633116.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 27 Feb 2026 16:18:51 +0000. Up 7.47 seconds.
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |  eth0  | True |         38.129.56.53         | 255.255.255.0 | global | fa:16:3e:68:59:92 |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |  eth0  | True | fe80::f816:3eff:fe68:5992/64 |       .       |  link  | fa:16:3e:68:59:92 |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 27 16:18:51 np0005633116.novalocal cloud-init[929]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 27 16:18:52 np0005633116.novalocal cloud-init[929]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: new group: name=cloud-user, GID=1001
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: add 'cloud-user' to group 'adm'
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: add 'cloud-user' to group 'systemd-journal'
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: add 'cloud-user' to shadow group 'adm'
Feb 27 16:18:53 np0005633116.novalocal useradd[995]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Generating public/private rsa key pair.
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key fingerprint is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: SHA256:ON4dOe+B7JjJVfFQtUO+CcqBvMxRrObefYXBA66A04I root@np0005633116.novalocal
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key's randomart image is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +---[RSA 3072]----+
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |          ..  .o.|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |        . o...o .|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |     . o +.oo+ + |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |    E +.=o+.++= +|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |      ooS=++. .* |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |     . o +.*  . .|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |      . o * +   .|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |       . B o o . |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |        = . . .  |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +----[SHA256]-----+
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Generating public/private ecdsa key pair.
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key fingerprint is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: SHA256:LRm06ITdh0DM43+M/Y+CNBF9CfHZxCyUt5cWtFBpqtc root@np0005633116.novalocal
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key's randomart image is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +---[ECDSA 256]---+
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |     +o ..o+.*+o.|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |     o+=.o..=+=+.|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |    ..+.=...oo+oo|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |     o. .=   ..o.|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |      ..S=. . o. |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |        =.+. . E |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |       . + ..    |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |        . . ..   |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |           ....  |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +----[SHA256]-----+
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Generating public/private ed25519 key pair.
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key fingerprint is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: SHA256:iC+RI/T6Mldf2rdcyHY9jco7Ba2BWqi/aTeVl/lHKyQ root@np0005633116.novalocal
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: The key's randomart image is:
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +--[ED25519 256]--+
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |                 |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |                 |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |  .      . . .   |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: | . . o .. o o .  |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |  . * ..So   = o |
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |   o +o . .E+o=oo|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |  . ...o + .*o+++|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |  o...  =.++o= .+|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: |   +.  .oo o*+. .|
Feb 27 16:18:53 np0005633116.novalocal cloud-init[929]: +----[SHA256]-----+
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Reached target Network is Online.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting System Logging Service...
Feb 27 16:18:53 np0005633116.novalocal sm-notify[1011]: Version 2.5.4 starting
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting Permit User Sessions...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 27 16:18:53 np0005633116.novalocal sshd[1013]: Server listening on 0.0.0.0 port 22.
Feb 27 16:18:53 np0005633116.novalocal sshd[1013]: Server listening on :: port 22.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Finished Permit User Sessions.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started Command Scheduler.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started Getty on tty1.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 27 16:18:53 np0005633116.novalocal crond[1016]: (CRON) STARTUP (1.5.7)
Feb 27 16:18:53 np0005633116.novalocal crond[1016]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Reached target Login Prompts.
Feb 27 16:18:53 np0005633116.novalocal crond[1016]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 40% if used.)
Feb 27 16:18:53 np0005633116.novalocal crond[1016]: (CRON) INFO (running with inotify support)
Feb 27 16:18:53 np0005633116.novalocal rsyslogd[1012]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1012" x-info="https://www.rsyslog.com"] start
Feb 27 16:18:53 np0005633116.novalocal rsyslogd[1012]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Started System Logging Service.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Reached target Multi-User System.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 27 16:18:53 np0005633116.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1057]: Unable to negotiate with 38.102.83.114 port 38340: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 27 16:18:53 np0005633116.novalocal rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1081]: Unable to negotiate with 38.102.83.114 port 38364: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1096]: Unable to negotiate with 38.102.83.114 port 38372: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1027]: Connection closed by 38.102.83.114 port 38336 [preauth]
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1067]: Connection closed by 38.102.83.114 port 38352 [preauth]
Feb 27 16:18:53 np0005633116.novalocal sshd-session[1146]: Unable to negotiate with 38.102.83.114 port 38406: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Feb 27 16:18:54 np0005633116.novalocal sshd-session[1154]: Unable to negotiate with 38.102.83.114 port 38412: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1157]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 27 Feb 2026 16:18:53 +0000. Up 9.53 seconds.
Feb 27 16:18:54 np0005633116.novalocal sshd-session[1112]: Connection closed by 38.102.83.114 port 38386 [preauth]
Feb 27 16:18:54 np0005633116.novalocal sshd-session[1125]: Connection closed by 38.102.83.114 port 38402 [preauth]
Feb 27 16:18:54 np0005633116.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Feb 27 16:18:54 np0005633116.novalocal kdumpctl[1021]: kdump: No kdump initial ramdisk found.
Feb 27 16:18:54 np0005633116.novalocal kdumpctl[1021]: kdump: Rebuilding /boot/initramfs-5.14.0-686.el9.x86_64kdump.img
Feb 27 16:18:54 np0005633116.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1378]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 27 Feb 2026 16:18:54 +0000. Up 9.93 seconds.
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1435]: #############################################################
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1449]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1458]: 256 SHA256:LRm06ITdh0DM43+M/Y+CNBF9CfHZxCyUt5cWtFBpqtc root@np0005633116.novalocal (ECDSA)
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1468]: 256 SHA256:iC+RI/T6Mldf2rdcyHY9jco7Ba2BWqi/aTeVl/lHKyQ root@np0005633116.novalocal (ED25519)
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1474]: 3072 SHA256:ON4dOe+B7JjJVfFQtUO+CcqBvMxRrObefYXBA66A04I root@np0005633116.novalocal (RSA)
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1475]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1477]: #############################################################
Feb 27 16:18:54 np0005633116.novalocal cloud-init[1378]: Cloud-init v. 24.4-8.el9 finished at Fri, 27 Feb 2026 16:18:54 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.14 seconds
Feb 27 16:18:54 np0005633116.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Feb 27 16:18:54 np0005633116.novalocal systemd[1]: Reached target Cloud-init target.
Feb 27 16:18:54 np0005633116.novalocal dracut[1535]: dracut-057-110.git20260130.el9
Feb 27 16:18:54 np0005633116.novalocal dracut[1537]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/37391a25-080d-4723-8b0c-cb88a559875b /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-686.el9.x86_64kdump.img 5.14.0-686.el9.x86_64
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: Module 'resume' will not be installed, because it's in the list to be omitted!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 27 16:18:55 np0005633116.novalocal dracut[1537]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: memstrack is not available
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: memstrack is not available
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: *** Including module: systemd ***
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: *** Including module: fips ***
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: *** Including module: systemd-initrd ***
Feb 27 16:18:56 np0005633116.novalocal dracut[1537]: *** Including module: i18n ***
Feb 27 16:18:57 np0005633116.novalocal dracut[1537]: *** Including module: drm ***
Feb 27 16:18:57 np0005633116.novalocal dracut[1537]: *** Including module: prefixdevname ***
Feb 27 16:18:57 np0005633116.novalocal dracut[1537]: *** Including module: kernel-modules ***
Feb 27 16:18:57 np0005633116.novalocal kernel: block vda: the capability attribute has been deprecated.
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: kernel-modules-extra ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: qemu ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: fstab-sys ***
Feb 27 16:18:58 np0005633116.novalocal chronyd[817]: Selected source 147.189.136.126 (2.centos.pool.ntp.org)
Feb 27 16:18:58 np0005633116.novalocal chronyd[817]: System clock TAI offset set to 37 seconds
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: rootfs-block ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: terminfo ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: udev-rules ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: Skipping udev rule: 91-permissions.rules
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: virtiofs ***
Feb 27 16:18:58 np0005633116.novalocal dracut[1537]: *** Including module: dracut-systemd ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]: *** Including module: usrmount ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]: *** Including module: base ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]: *** Including module: fs-lib ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]: *** Including module: kdumpbase ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:   microcode_ctl module: mangling fw_dir
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel" is ignored
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 27 16:18:59 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Including module: openssl ***
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Including module: shutdown ***
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Including module: squash ***
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Including modules done ***
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Installing kernel module dependencies ***
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 25 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 25 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 31 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 28 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 32 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 30 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Feb 27 16:19:00 np0005633116.novalocal irqbalance[796]: IRQ 29 affinity is now unmanaged
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Installing kernel module dependencies done ***
Feb 27 16:19:00 np0005633116.novalocal dracut[1537]: *** Resolving executable dependencies ***
Feb 27 16:19:01 np0005633116.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: *** Resolving executable dependencies done ***
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: *** Generating early-microcode cpio image ***
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: *** Store current command line parameters ***
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: Stored kernel commandline:
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: No dracut internal kernel commandline stored in the initramfs
Feb 27 16:19:02 np0005633116.novalocal dracut[1537]: *** Install squash loader ***
Feb 27 16:19:03 np0005633116.novalocal dracut[1537]: *** Squashing the files inside the initramfs ***
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: *** Squashing the files inside the initramfs done ***
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: *** Creating image file '/boot/initramfs-5.14.0-686.el9.x86_64kdump.img' ***
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: *** Hardlinking files ***
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Mode:           real
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Files:          50
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Linked:         0 files
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Compared:       0 xattrs
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Compared:       0 files
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Saved:          0 B
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: Duration:       0.000340 seconds
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: *** Hardlinking files done ***
Feb 27 16:19:04 np0005633116.novalocal dracut[1537]: *** Creating initramfs image file '/boot/initramfs-5.14.0-686.el9.x86_64kdump.img' done ***
Feb 27 16:19:05 np0005633116.novalocal kdumpctl[1021]: kdump: kexec: loaded kdump kernel
Feb 27 16:19:05 np0005633116.novalocal kdumpctl[1021]: kdump: Starting kdump: [OK]
Feb 27 16:19:05 np0005633116.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 27 16:19:05 np0005633116.novalocal systemd[1]: Startup finished in 1.417s (kernel) + 2.746s (initrd) + 16.701s (userspace) = 20.866s.
Feb 27 16:19:18 np0005633116.novalocal sshd-session[4788]: Accepted publickey for zuul from 38.102.83.114 port 57870 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 27 16:19:18 np0005633116.novalocal systemd-logind[803]: New session 1 of user zuul.
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Queued start job for default target Main User Target.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Created slice User Application Slice.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Started Daily Cleanup of User's Temporary Directories.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Reached target Paths.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Reached target Timers.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Starting D-Bus User Message Bus Socket...
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Starting Create User's Volatile Files and Directories...
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Listening on D-Bus User Message Bus Socket.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Reached target Sockets.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Finished Create User's Volatile Files and Directories.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Reached target Basic System.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Reached target Main User Target.
Feb 27 16:19:18 np0005633116.novalocal systemd[4792]: Startup finished in 139ms.
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 27 16:19:18 np0005633116.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 27 16:19:18 np0005633116.novalocal sshd-session[4788]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:19:19 np0005633116.novalocal python3[4874]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:19:21 np0005633116.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 27 16:19:21 np0005633116.novalocal python3[4902]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:19:28 np0005633116.novalocal python3[4962]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:19:29 np0005633116.novalocal python3[5002]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 27 16:19:31 np0005633116.novalocal python3[5028]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDBP47mNHfBDTJ/G9DegKAiVTw1NXAZOMpH/p2qIpljyF3Y5mFiKkJ0Aba/UThWPzpmq0Qeon8ri7NczVB+rtmcCKMiiH4gWFFiULeSrAk/rkaDxoxTjXRaC0QC5PgVYGTeltyYwSBz0PT4Ecbl/2WQ+2ttBANTiQMQqVVm9xMaukNB1HVzNdiS97uykyCIhxAAtc/WRP+8BmjOR68ATXl1ukDB610S+dsoGFdpvVLXXQHhS8XvzeSC/cF2uHCnS+Epm7XxJoLT/txNBSqDU2nQhSC5lQzBIko9YdXR3faTTHxn2h6G+pImGe/lIFJq7mv2rr7XXGWiAP4UXYxkpRMv5qb5YoQ8I+rEy4SNHtIwbKJIVr/Xi0LiBvDAkDUSiP/9oIhEuZmYc9ZCp4H4HXU2GSeEPDh83EzWyvT3TeJkj2RqHJT4vcdWsWcN8Be/8LHNCeSfoTBx4yDNB48bv/OQkO9T841pWShnzmxq387xmSax/1hsWgaF+2W+o97njsk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:31 np0005633116.novalocal python3[5052]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:32 np0005633116.novalocal python3[5151]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:32 np0005633116.novalocal python3[5222]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772209171.9438403-207-247603461735937/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=171cba308a1a432d98a7e7fbb3ee3907_id_rsa follow=False checksum=f6e2c4213d32c17afcbbb9f5dd286e31c1546252 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:33 np0005633116.novalocal python3[5345]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:33 np0005633116.novalocal python3[5416]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772209172.992117-240-99438385284682/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=171cba308a1a432d98a7e7fbb3ee3907_id_rsa.pub follow=False checksum=b42eadc45fbcfdd70590f29545bb2396356abedd backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:35 np0005633116.novalocal python3[5464]: ansible-ping Invoked with data=pong
Feb 27 16:19:36 np0005633116.novalocal python3[5488]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:19:37 np0005633116.novalocal python3[5546]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 27 16:19:39 np0005633116.novalocal python3[5578]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:39 np0005633116.novalocal python3[5602]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:39 np0005633116.novalocal python3[5626]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:39 np0005633116.novalocal python3[5650]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:40 np0005633116.novalocal python3[5674]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:40 np0005633116.novalocal python3[5698]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:41 np0005633116.novalocal sudo[5722]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xicltugewgfxgbvokieavarohnhrtdnp ; /usr/bin/python3'
Feb 27 16:19:41 np0005633116.novalocal sudo[5722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:41 np0005633116.novalocal python3[5724]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:41 np0005633116.novalocal sudo[5722]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:42 np0005633116.novalocal sudo[5800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvtkvrqxejcwafdkewbfhgzepacjuvtz ; /usr/bin/python3'
Feb 27 16:19:42 np0005633116.novalocal sudo[5800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:42 np0005633116.novalocal python3[5802]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:42 np0005633116.novalocal sudo[5800]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:42 np0005633116.novalocal sudo[5873]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpovjdmixklnnzmynoxjibmbhedglvtc ; /usr/bin/python3'
Feb 27 16:19:42 np0005633116.novalocal sudo[5873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:42 np0005633116.novalocal python3[5875]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772209182.1614914-21-88425881744871/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:43 np0005633116.novalocal sudo[5873]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:43 np0005633116.novalocal python3[5923]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:43 np0005633116.novalocal python3[5947]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:44 np0005633116.novalocal python3[5971]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:44 np0005633116.novalocal python3[5995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:44 np0005633116.novalocal python3[6019]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:44 np0005633116.novalocal python3[6043]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:45 np0005633116.novalocal python3[6067]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:45 np0005633116.novalocal python3[6091]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:45 np0005633116.novalocal python3[6115]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:45 np0005633116.novalocal python3[6139]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:46 np0005633116.novalocal python3[6163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:46 np0005633116.novalocal python3[6187]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:46 np0005633116.novalocal python3[6211]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:46 np0005633116.novalocal python3[6235]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:47 np0005633116.novalocal python3[6259]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:47 np0005633116.novalocal python3[6283]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:47 np0005633116.novalocal python3[6307]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:47 np0005633116.novalocal python3[6331]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:48 np0005633116.novalocal python3[6355]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:48 np0005633116.novalocal python3[6379]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:48 np0005633116.novalocal python3[6403]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:49 np0005633116.novalocal python3[6427]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:49 np0005633116.novalocal python3[6451]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:49 np0005633116.novalocal python3[6475]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:50 np0005633116.novalocal python3[6499]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:50 np0005633116.novalocal python3[6523]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:19:50 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 26 affinity: Operation not permitted
Feb 27 16:19:50 np0005633116.novalocal irqbalance[796]: IRQ 26 affinity is now unmanaged
Feb 27 16:19:53 np0005633116.novalocal sudo[6547]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wznproxwykqgbccpdzmddvhtopakmqog ; /usr/bin/python3'
Feb 27 16:19:53 np0005633116.novalocal sudo[6547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:53 np0005633116.novalocal python3[6549]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 27 16:19:53 np0005633116.novalocal systemd[1]: Starting Time & Date Service...
Feb 27 16:19:53 np0005633116.novalocal systemd[1]: Started Time & Date Service.
Feb 27 16:19:53 np0005633116.novalocal systemd-timedated[6551]: Changed time zone to 'UTC' (UTC).
Feb 27 16:19:53 np0005633116.novalocal sudo[6547]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:54 np0005633116.novalocal sudo[6579]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxbbdobicgekktwcpghfywrfdrjvmkkt ; /usr/bin/python3'
Feb 27 16:19:54 np0005633116.novalocal sudo[6579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:54 np0005633116.novalocal python3[6581]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:54 np0005633116.novalocal sudo[6579]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:55 np0005633116.novalocal python3[6657]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:55 np0005633116.novalocal python3[6728]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1772209194.7840395-153-42353772681463/source _original_basename=tmpls0rpipx follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:55 np0005633116.novalocal python3[6828]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:56 np0005633116.novalocal python3[6899]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1772209195.6260831-183-205735098297117/source _original_basename=tmprlrzhsaz follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:56 np0005633116.novalocal sudo[6999]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfwvfvexolubhxuigxjhnezmktifjkzp ; /usr/bin/python3'
Feb 27 16:19:56 np0005633116.novalocal sudo[6999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:56 np0005633116.novalocal python3[7001]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:56 np0005633116.novalocal sudo[6999]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:57 np0005633116.novalocal sudo[7072]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjsyhjxdzicovkkbpzknbakbhknbgvjp ; /usr/bin/python3'
Feb 27 16:19:57 np0005633116.novalocal sudo[7072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:57 np0005633116.novalocal python3[7074]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1772209196.6725967-231-66646069128751/source _original_basename=tmp6bo39gsw follow=False checksum=5ba162e0e6cfa2f970ec11e4fcc71ca2518fbed1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:57 np0005633116.novalocal sudo[7072]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:58 np0005633116.novalocal python3[7122]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:19:58 np0005633116.novalocal python3[7148]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:19:58 np0005633116.novalocal sudo[7226]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvttpsipslpdufrxbwppfilpsnstmdi ; /usr/bin/python3'
Feb 27 16:19:58 np0005633116.novalocal sudo[7226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:58 np0005633116.novalocal python3[7228]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:19:58 np0005633116.novalocal sudo[7226]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:59 np0005633116.novalocal sudo[7299]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmjnacocjmxbbvffwgknzrrepaymjsgw ; /usr/bin/python3'
Feb 27 16:19:59 np0005633116.novalocal sudo[7299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:59 np0005633116.novalocal python3[7301]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1772209198.5378745-273-273285282066106/source _original_basename=tmpet33mw6s follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:19:59 np0005633116.novalocal sudo[7299]: pam_unix(sudo:session): session closed for user root
Feb 27 16:19:59 np0005633116.novalocal sudo[7350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwhzhcyinhhomvfumubbpcnkbiyxncwu ; /usr/bin/python3'
Feb 27 16:19:59 np0005633116.novalocal sudo[7350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:19:59 np0005633116.novalocal python3[7352]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-496a-0b8f-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:19:59 np0005633116.novalocal sudo[7350]: pam_unix(sudo:session): session closed for user root
Feb 27 16:20:00 np0005633116.novalocal python3[7380]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-496a-0b8f-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 27 16:20:01 np0005633116.novalocal python3[7408]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:20:19 np0005633116.novalocal sudo[7432]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-carcxczqbovwrsqrqtmfgtbpbvauayoi ; /usr/bin/python3'
Feb 27 16:20:19 np0005633116.novalocal sudo[7432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:20:20 np0005633116.novalocal python3[7434]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:20:20 np0005633116.novalocal sudo[7432]: pam_unix(sudo:session): session closed for user root
Feb 27 16:20:23 np0005633116.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Feb 27 16:20:59 np0005633116.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Feb 27 16:20:59 np0005633116.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4500] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 27 16:20:59 np0005633116.novalocal systemd-udevd[7437]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4744] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4768] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4771] device (eth1): carrier: link connected
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4773] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4778] policy: auto-activating connection 'Wired connection 1' (281fa0e3-e3e5-328a-b833-575ca1dafd63)
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4782] device (eth1): Activation: starting connection 'Wired connection 1' (281fa0e3-e3e5-328a-b833-575ca1dafd63)
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4783] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4787] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4790] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:20:59 np0005633116.novalocal NetworkManager[865]: <info>  [1772209259.4794] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:21:00 np0005633116.novalocal python3[7464]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-d866-b07b-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:21:07 np0005633116.novalocal sudo[7542]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okrkxrutanraktpfmsyptwipluapahbr ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 27 16:21:07 np0005633116.novalocal sudo[7542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:21:07 np0005633116.novalocal python3[7544]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:21:07 np0005633116.novalocal sudo[7542]: pam_unix(sudo:session): session closed for user root
Feb 27 16:21:07 np0005633116.novalocal sudo[7615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgfjgsjlpqkrcjtwvluwleegtjixvwcd ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 27 16:21:07 np0005633116.novalocal sudo[7615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:21:07 np0005633116.novalocal python3[7617]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772209266.9092367-102-122321560496988/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=339ce622e62aa8cad6aa3dde7b64c8c10ff54375 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:21:07 np0005633116.novalocal sudo[7615]: pam_unix(sudo:session): session closed for user root
Feb 27 16:21:08 np0005633116.novalocal sudo[7665]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkyukyokeshqpcpdnkosnkdleshnigkx ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 27 16:21:08 np0005633116.novalocal sudo[7665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:21:08 np0005633116.novalocal python3[7667]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Stopping Network Manager...
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2574] caught SIGTERM, shutting down normally.
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2584] dhcp4 (eth0): canceled DHCP transaction
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2584] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2584] dhcp4 (eth0): state changed no lease
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2587] manager: NetworkManager state is now CONNECTING
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2672] dhcp4 (eth1): canceled DHCP transaction
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2673] dhcp4 (eth1): state changed no lease
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[865]: <info>  [1772209268.2722] exiting (success)
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Stopped Network Manager.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Starting Network Manager...
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.3235] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:7134621a-8b85-4cf7-b630-8b1bd86c0689)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.3237] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.3276] manager[0x55c4eb033000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Starting Hostname Service...
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Started Hostname Service.
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4161] hostname: hostname: using hostnamed
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4164] hostname: static hostname changed from (none) to "np0005633116.novalocal"
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4169] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4175] manager[0x55c4eb033000]: rfkill: Wi-Fi hardware radio set enabled
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4175] manager[0x55c4eb033000]: rfkill: WWAN hardware radio set enabled
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4208] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4208] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4209] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4210] manager: Networking is enabled by state file
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4212] settings: Loaded settings plugin: keyfile (internal)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4217] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4247] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4257] dhcp: init: Using DHCP client 'internal'
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4261] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4268] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4275] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4283] device (lo): Activation: starting connection 'lo' (64e865f2-9b77-47c6-8998-ed75b7b9b4c4)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4289] device (eth0): carrier: link connected
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4293] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4297] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4298] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4303] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4309] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4314] device (eth1): carrier: link connected
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4318] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4323] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (281fa0e3-e3e5-328a-b833-575ca1dafd63) (indicated)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4323] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4327] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4334] device (eth1): Activation: starting connection 'Wired connection 1' (281fa0e3-e3e5-328a-b833-575ca1dafd63)
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Started Network Manager.
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4340] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4344] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4346] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4348] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4350] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4353] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4356] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4359] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4362] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4370] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4374] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4382] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4384] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4396] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4401] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 27 16:21:08 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209268.4406] device (lo): Activation: successful, device activated.
Feb 27 16:21:08 np0005633116.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 27 16:21:08 np0005633116.novalocal sudo[7665]: pam_unix(sudo:session): session closed for user root
Feb 27 16:21:08 np0005633116.novalocal python3[7732]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-d866-b07b-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6765] dhcp4 (eth0): state changed new lease, address=38.129.56.53
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6775] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6846] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6869] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6871] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6875] manager: NetworkManager state is now CONNECTED_SITE
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6877] device (eth0): Activation: successful, device activated.
Feb 27 16:21:09 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209269.6880] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 27 16:21:19 np0005633116.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:21:38 np0005633116.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 27 16:21:43 np0005633116.novalocal systemd[4792]: Starting Mark boot as successful...
Feb 27 16:21:43 np0005633116.novalocal systemd[4792]: Finished Mark boot as successful.
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4555] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 27 16:21:53 np0005633116.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:21:53 np0005633116.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4835] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4837] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4842] device (eth1): Activation: successful, device activated.
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4847] manager: startup complete
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4849] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <warn>  [1772209313.4852] device (eth1): Activation: failed for connection 'Wired connection 1'
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4858] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4942] dhcp4 (eth1): canceled DHCP transaction
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4943] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4943] dhcp4 (eth1): state changed no lease
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4953] policy: auto-activating connection 'ci-private-network' (4c61635a-3137-500c-b3cf-3399f56581ba)
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4956] device (eth1): Activation: starting connection 'ci-private-network' (4c61635a-3137-500c-b3cf-3399f56581ba)
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4957] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4959] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4964] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.4970] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.5013] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.5015] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:21:53 np0005633116.novalocal NetworkManager[7681]: <info>  [1772209313.5020] device (eth1): Activation: successful, device activated.
Feb 27 16:22:03 np0005633116.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:22:05 np0005633116.novalocal sudo[7855]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrytpzxtnhczrsxkmgdgfnmwfyicornd ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 27 16:22:05 np0005633116.novalocal sudo[7855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:22:05 np0005633116.novalocal python3[7857]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:22:05 np0005633116.novalocal sudo[7855]: pam_unix(sudo:session): session closed for user root
Feb 27 16:22:05 np0005633116.novalocal sudo[7928]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fboqjkbksbrbdmseqamdarfoygfhixhi ; OS_CLOUD=vexxhost /usr/bin/python3'
Feb 27 16:22:05 np0005633116.novalocal sudo[7928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:22:05 np0005633116.novalocal python3[7930]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772209325.358159-259-65594105005313/source _original_basename=tmp4ts410sv follow=False checksum=0caff45fdf895fa2ba77939b196bf1ae5da00f03 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:22:05 np0005633116.novalocal sudo[7928]: pam_unix(sudo:session): session closed for user root
Feb 27 16:23:05 np0005633116.novalocal sshd-session[4801]: Received disconnect from 38.102.83.114 port 57870:11: disconnected by user
Feb 27 16:23:05 np0005633116.novalocal sshd-session[4801]: Disconnected from user zuul 38.102.83.114 port 57870
Feb 27 16:23:05 np0005633116.novalocal sshd-session[4788]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:23:05 np0005633116.novalocal systemd-logind[803]: Session 1 logged out. Waiting for processes to exit.
Feb 27 16:24:43 np0005633116.novalocal systemd[4792]: Created slice User Background Tasks Slice.
Feb 27 16:24:43 np0005633116.novalocal systemd[4792]: Starting Cleanup of User's Temporary Files and Directories...
Feb 27 16:24:43 np0005633116.novalocal systemd[4792]: Finished Cleanup of User's Temporary Files and Directories.
Feb 27 16:28:43 np0005633116.novalocal sshd-session[7959]: Invalid user admin from 101.36.123.102 port 45102
Feb 27 16:28:43 np0005633116.novalocal sshd-session[7959]: Connection closed by invalid user admin 101.36.123.102 port 45102 [preauth]
Feb 27 16:31:12 np0005633116.novalocal sshd-session[7963]: Accepted publickey for zuul from 38.102.83.114 port 49886 ssh2: RSA SHA256:9nr1WnrycB1u+Slj9uMvvEqsBvLD0JIi4SFis5V76CI
Feb 27 16:31:12 np0005633116.novalocal systemd-logind[803]: New session 3 of user zuul.
Feb 27 16:31:12 np0005633116.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 27 16:31:12 np0005633116.novalocal sshd-session[7963]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:31:12 np0005633116.novalocal sudo[7990]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfjarrzlqjffxzunnzjqfnzeolvyhpcg ; /usr/bin/python3'
Feb 27 16:31:12 np0005633116.novalocal sudo[7990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:12 np0005633116.novalocal python3[7992]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-e2b5-5aa3-0000000021be-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:12 np0005633116.novalocal sudo[7990]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:12 np0005633116.novalocal sudo[8019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npfexydxicuqviczcgwlssopxvdueucb ; /usr/bin/python3'
Feb 27 16:31:12 np0005633116.novalocal sudo[8019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:12 np0005633116.novalocal python3[8021]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:12 np0005633116.novalocal sudo[8019]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:12 np0005633116.novalocal sudo[8045]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pffskkzysdckatbmqbvmgyopdueajbgu ; /usr/bin/python3'
Feb 27 16:31:12 np0005633116.novalocal sudo[8045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:13 np0005633116.novalocal python3[8047]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:13 np0005633116.novalocal sudo[8045]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:13 np0005633116.novalocal sudo[8071]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olwcmlbhqdixxmgloeqsdpulklummmkl ; /usr/bin/python3'
Feb 27 16:31:13 np0005633116.novalocal sudo[8071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:13 np0005633116.novalocal python3[8073]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:13 np0005633116.novalocal sudo[8071]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:13 np0005633116.novalocal sudo[8097]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvoxtdicewusoidfmraakckpvohonomo ; /usr/bin/python3'
Feb 27 16:31:13 np0005633116.novalocal sudo[8097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:13 np0005633116.novalocal python3[8099]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:13 np0005633116.novalocal sudo[8097]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:14 np0005633116.novalocal sudo[8123]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wczcupjybxmsdwswrgcgoouocbawupsx ; /usr/bin/python3'
Feb 27 16:31:14 np0005633116.novalocal sudo[8123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:14 np0005633116.novalocal python3[8125]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:14 np0005633116.novalocal sudo[8123]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:14 np0005633116.novalocal sudo[8201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awzaakctamhxnpukicksfdrnzidteipr ; /usr/bin/python3'
Feb 27 16:31:14 np0005633116.novalocal sudo[8201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:14 np0005633116.novalocal python3[8203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:31:14 np0005633116.novalocal sudo[8201]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:14 np0005633116.novalocal sudo[8274]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ursiiyqlqmlsqbpsrkgeitnqoeuvoifg ; /usr/bin/python3'
Feb 27 16:31:14 np0005633116.novalocal sudo[8274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:15 np0005633116.novalocal python3[8276]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772209874.547193-521-239085177091514/source _original_basename=tmphdth4gjo follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:31:15 np0005633116.novalocal sudo[8274]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:15 np0005633116.novalocal sudo[8324]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhyvnijlxridwvurzcmytwzdgfxaauda ; /usr/bin/python3'
Feb 27 16:31:15 np0005633116.novalocal sudo[8324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:16 np0005633116.novalocal python3[8326]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 16:31:16 np0005633116.novalocal systemd[1]: Reloading.
Feb 27 16:31:16 np0005633116.novalocal systemd-rc-local-generator[8343]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:31:16 np0005633116.novalocal sudo[8324]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:17 np0005633116.novalocal sudo[8386]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evgjwdpmzgvpauqftmmhnfaoahgwbhoo ; /usr/bin/python3'
Feb 27 16:31:17 np0005633116.novalocal sudo[8386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:17 np0005633116.novalocal python3[8388]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 27 16:31:17 np0005633116.novalocal sudo[8386]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:18 np0005633116.novalocal sudo[8412]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmbyvkefjrluqiqtdjbrykkyylipspk ; /usr/bin/python3'
Feb 27 16:31:18 np0005633116.novalocal sudo[8412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:18 np0005633116.novalocal python3[8414]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:18 np0005633116.novalocal sudo[8412]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:18 np0005633116.novalocal sudo[8440]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sueemavssovxvcludmgyyzxqiupeobir ; /usr/bin/python3'
Feb 27 16:31:18 np0005633116.novalocal sudo[8440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:18 np0005633116.novalocal python3[8442]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:18 np0005633116.novalocal sudo[8440]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:18 np0005633116.novalocal sudo[8468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvjmoqpzlexphjfpvktaogyvitkghfvd ; /usr/bin/python3'
Feb 27 16:31:18 np0005633116.novalocal sudo[8468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:18 np0005633116.novalocal python3[8470]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:18 np0005633116.novalocal sudo[8468]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:18 np0005633116.novalocal sudo[8496]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avtngrqhwxbxynfqtjxuhvfexkokkhfw ; /usr/bin/python3'
Feb 27 16:31:18 np0005633116.novalocal sudo[8496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:18 np0005633116.novalocal python3[8498]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:18 np0005633116.novalocal sudo[8496]: pam_unix(sudo:session): session closed for user root
Feb 27 16:31:19 np0005633116.novalocal python3[8525]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163efc-24cc-e2b5-5aa3-0000000021c5-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:31:20 np0005633116.novalocal python3[8555]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 27 16:31:20 np0005633116.novalocal irqbalance[796]: Cannot change IRQ 27 affinity: Operation not permitted
Feb 27 16:31:20 np0005633116.novalocal irqbalance[796]: IRQ 27 affinity is now unmanaged
Feb 27 16:31:22 np0005633116.novalocal sshd-session[7966]: Connection closed by 38.102.83.114 port 49886
Feb 27 16:31:22 np0005633116.novalocal sshd-session[7963]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:31:22 np0005633116.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 27 16:31:22 np0005633116.novalocal systemd[1]: session-3.scope: Consumed 3.799s CPU time.
Feb 27 16:31:22 np0005633116.novalocal systemd-logind[803]: Session 3 logged out. Waiting for processes to exit.
Feb 27 16:31:22 np0005633116.novalocal systemd-logind[803]: Removed session 3.
Feb 27 16:31:24 np0005633116.novalocal sshd-session[8563]: Accepted publickey for zuul from 38.102.83.114 port 46270 ssh2: RSA SHA256:9nr1WnrycB1u+Slj9uMvvEqsBvLD0JIi4SFis5V76CI
Feb 27 16:31:24 np0005633116.novalocal systemd-logind[803]: New session 4 of user zuul.
Feb 27 16:31:24 np0005633116.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 27 16:31:24 np0005633116.novalocal sshd-session[8563]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:31:24 np0005633116.novalocal sudo[8590]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urgslipmvgstpknmberlbcreihqqmqup ; /usr/bin/python3'
Feb 27 16:31:24 np0005633116.novalocal sudo[8590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:31:24 np0005633116.novalocal python3[8592]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 27 16:31:30 np0005633116.novalocal setsebool[8627]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 27 16:31:30 np0005633116.novalocal setsebool[8627]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  Converting 385 SID table entries...
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:31:41 np0005633116.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  Converting 388 SID table entries...
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:31:50 np0005633116.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:32:08 np0005633116.novalocal dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 27 16:32:08 np0005633116.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:32:08 np0005633116.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:32:08 np0005633116.novalocal systemd[1]: Reloading.
Feb 27 16:32:08 np0005633116.novalocal systemd-rc-local-generator[9416]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:32:08 np0005633116.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:32:09 np0005633116.novalocal sudo[8590]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:11 np0005633116.novalocal python3[12686]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163efc-24cc-9bfe-ec88-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:32:12 np0005633116.novalocal kernel: evm: overlay not supported
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: Starting D-Bus User Message Bus...
Feb 27 16:32:12 np0005633116.novalocal dbus-broker-launch[13667]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 27 16:32:12 np0005633116.novalocal dbus-broker-launch[13667]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: Started D-Bus User Message Bus.
Feb 27 16:32:12 np0005633116.novalocal dbus-broker-lau[13667]: Ready
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: Created slice Slice /user.
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: podman-13509.scope: unit configures an IP firewall, but not running as root.
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: (This warning is only shown for the first unit using IP firewalling.)
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: Started podman-13509.scope.
Feb 27 16:32:12 np0005633116.novalocal systemd[4792]: Started podman-pause-66e6fb62.scope.
Feb 27 16:32:13 np0005633116.novalocal sudo[14483]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwqwsfiqvpvqzqlnwcckkspfedhwxueg ; /usr/bin/python3'
Feb 27 16:32:13 np0005633116.novalocal sudo[14483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:13 np0005633116.novalocal python3[14504]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.129.56.240:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.129.56.240:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:32:13 np0005633116.novalocal python3[14504]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Feb 27 16:32:13 np0005633116.novalocal sudo[14483]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:13 np0005633116.novalocal sshd-session[8566]: Connection closed by 38.102.83.114 port 46270
Feb 27 16:32:13 np0005633116.novalocal sshd-session[8563]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:32:13 np0005633116.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 27 16:32:13 np0005633116.novalocal systemd[1]: session-4.scope: Consumed 40.500s CPU time.
Feb 27 16:32:13 np0005633116.novalocal systemd-logind[803]: Session 4 logged out. Waiting for processes to exit.
Feb 27 16:32:13 np0005633116.novalocal systemd-logind[803]: Removed session 4.
Feb 27 16:32:34 np0005633116.novalocal sshd-session[25438]: Unable to negotiate with 38.102.83.159 port 38084: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 27 16:32:34 np0005633116.novalocal sshd-session[25435]: Connection closed by 38.102.83.159 port 38058 [preauth]
Feb 27 16:32:34 np0005633116.novalocal sshd-session[25440]: Connection closed by 38.102.83.159 port 38066 [preauth]
Feb 27 16:32:34 np0005633116.novalocal sshd-session[25441]: Unable to negotiate with 38.102.83.159 port 38080: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 27 16:32:34 np0005633116.novalocal sshd-session[25443]: Unable to negotiate with 38.102.83.159 port 38086: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 27 16:32:38 np0005633116.novalocal sshd-session[27610]: Accepted publickey for zuul from 38.102.83.114 port 33204 ssh2: RSA SHA256:9nr1WnrycB1u+Slj9uMvvEqsBvLD0JIi4SFis5V76CI
Feb 27 16:32:38 np0005633116.novalocal systemd-logind[803]: New session 5 of user zuul.
Feb 27 16:32:38 np0005633116.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 27 16:32:38 np0005633116.novalocal sshd-session[27610]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:32:38 np0005633116.novalocal python3[27723]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKWBA7QErlZn5gY6CCIHobPk1ol+XKx7rAOyyNPObvDSJBNKTa9SP6lIamA8p9/UbVhJhz2qr2flzXfq/Gfjclk= zuul@np0005633115.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:32:38 np0005633116.novalocal sudo[27995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nykaywyxusphvnafmozetyxvcqlaxidz ; /usr/bin/python3'
Feb 27 16:32:38 np0005633116.novalocal sudo[27995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:39 np0005633116.novalocal python3[28004]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKWBA7QErlZn5gY6CCIHobPk1ol+XKx7rAOyyNPObvDSJBNKTa9SP6lIamA8p9/UbVhJhz2qr2flzXfq/Gfjclk= zuul@np0005633115.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:32:39 np0005633116.novalocal sudo[27995]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:39 np0005633116.novalocal sudo[28375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzdmvsenwpzcykywdxgjlreruqmnfkmd ; /usr/bin/python3'
Feb 27 16:32:39 np0005633116.novalocal sudo[28375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:39 np0005633116.novalocal python3[28388]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005633116.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 27 16:32:39 np0005633116.novalocal useradd[28468]: new group: name=cloud-admin, GID=1002
Feb 27 16:32:39 np0005633116.novalocal useradd[28468]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Feb 27 16:32:40 np0005633116.novalocal sudo[28375]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:40 np0005633116.novalocal sudo[28615]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnvzuwhpqwvktfciowmhscuvciksmyxf ; /usr/bin/python3'
Feb 27 16:32:40 np0005633116.novalocal sudo[28615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:40 np0005633116.novalocal python3[28627]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKWBA7QErlZn5gY6CCIHobPk1ol+XKx7rAOyyNPObvDSJBNKTa9SP6lIamA8p9/UbVhJhz2qr2flzXfq/Gfjclk= zuul@np0005633115.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 27 16:32:40 np0005633116.novalocal sudo[28615]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:40 np0005633116.novalocal sudo[28907]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owwlumngpgvnxbslyuvqlmicmmybhdnr ; /usr/bin/python3'
Feb 27 16:32:40 np0005633116.novalocal sudo[28907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:40 np0005633116.novalocal python3[28919]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:32:40 np0005633116.novalocal sudo[28907]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:41 np0005633116.novalocal sudo[29197]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntbrbijvlamsgwnmnzdylepuhgutnpek ; /usr/bin/python3'
Feb 27 16:32:41 np0005633116.novalocal sudo[29197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:41 np0005633116.novalocal python3[29207]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772209960.5210948-139-238938322755076/source _original_basename=tmp7mhwprf6 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:32:41 np0005633116.novalocal sudo[29197]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:41 np0005633116.novalocal sudo[29552]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhxdmzrdgzohdwqthfidkfvkdltasyqq ; /usr/bin/python3'
Feb 27 16:32:41 np0005633116.novalocal sudo[29552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:32:42 np0005633116.novalocal python3[29562]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Feb 27 16:32:42 np0005633116.novalocal systemd[1]: Starting Hostname Service...
Feb 27 16:32:42 np0005633116.novalocal systemd[1]: Started Hostname Service.
Feb 27 16:32:42 np0005633116.novalocal systemd-hostnamed[29698]: Changed pretty hostname to 'compute-0'
Feb 27 16:32:42 compute-0 systemd-hostnamed[29698]: Hostname set to <compute-0> (static)
Feb 27 16:32:42 compute-0 NetworkManager[7681]: <info>  [1772209962.2441] hostname: static hostname changed from "np0005633116.novalocal" to "compute-0"
Feb 27 16:32:42 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:32:42 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:32:42 compute-0 sudo[29552]: pam_unix(sudo:session): session closed for user root
Feb 27 16:32:42 compute-0 sshd-session[27667]: Connection closed by 38.102.83.114 port 33204
Feb 27 16:32:42 compute-0 sshd-session[27610]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:32:42 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Feb 27 16:32:42 compute-0 systemd[1]: session-5.scope: Consumed 2.235s CPU time.
Feb 27 16:32:42 compute-0 systemd-logind[803]: Session 5 logged out. Waiting for processes to exit.
Feb 27 16:32:42 compute-0 systemd-logind[803]: Removed session 5.
Feb 27 16:32:43 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:32:43 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:32:43 compute-0 systemd[1]: man-db-cache-update.service: Consumed 40.915s CPU time.
Feb 27 16:32:43 compute-0 systemd[1]: run-r6dc6518907e940aeb00f86ba6dd261f8.service: Deactivated successfully.
Feb 27 16:32:52 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:33:12 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 27 16:34:33 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Feb 27 16:34:33 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 27 16:34:33 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Feb 27 16:34:33 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 27 16:35:01 compute-0 sshd-session[30532]: Connection closed by 45.148.10.240 port 57846
Feb 27 16:36:41 compute-0 sshd-session[30534]: Accepted publickey for zuul from 38.102.83.159 port 53310 ssh2: RSA SHA256:9nr1WnrycB1u+Slj9uMvvEqsBvLD0JIi4SFis5V76CI
Feb 27 16:36:41 compute-0 systemd-logind[803]: New session 6 of user zuul.
Feb 27 16:36:41 compute-0 systemd[1]: Started Session 6 of User zuul.
Feb 27 16:36:41 compute-0 sshd-session[30534]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:36:42 compute-0 python3[30610]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:36:43 compute-0 sudo[30724]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyunttmazxwpkfepsgbaskmekvnimzew ; /usr/bin/python3'
Feb 27 16:36:43 compute-0 sudo[30724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:44 compute-0 python3[30726]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:44 compute-0 sudo[30724]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:44 compute-0 sudo[30797]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taikuqvbzuvfzfxbwegjaqhrmwqcpcet ; /usr/bin/python3'
Feb 27 16:36:44 compute-0 sudo[30797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:44 compute-0 python3[30799]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=delorean.repo follow=False checksum=c7624fe5e858d4139de1ac159778eb6fd097c2ca backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:44 compute-0 sudo[30797]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:44 compute-0 sudo[30823]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvgtomiajtzlgriectsphrvpkrvuoesz ; /usr/bin/python3'
Feb 27 16:36:44 compute-0 sudo[30823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:44 compute-0 python3[30825]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:44 compute-0 sudo[30823]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:44 compute-0 sudo[30896]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrghwiedbapehxortdvnbgpsmvktkftd ; /usr/bin/python3'
Feb 27 16:36:44 compute-0 sudo[30896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:45 compute-0 python3[30898]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:45 compute-0 sudo[30896]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:45 compute-0 sudo[30922]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdqfzmxujqphfndmjadbtouqughcrtvt ; /usr/bin/python3'
Feb 27 16:36:45 compute-0 sudo[30922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:45 compute-0 python3[30924]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:45 compute-0 sudo[30922]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:45 compute-0 sudo[30995]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wemeoqvobmeetjonhakokaxvfdvvbbzy ; /usr/bin/python3'
Feb 27 16:36:45 compute-0 sudo[30995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:45 compute-0 python3[30997]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:45 compute-0 sudo[30995]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:45 compute-0 sudo[31021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhrzwdroldisopjqupgokxdtlydqylkw ; /usr/bin/python3'
Feb 27 16:36:45 compute-0 sudo[31021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:46 compute-0 python3[31023]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:46 compute-0 sudo[31021]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:46 compute-0 sudo[31094]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slvvrvdwoqabmkbbkcowsfwxyjwfbuvn ; /usr/bin/python3'
Feb 27 16:36:46 compute-0 sudo[31094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:46 compute-0 python3[31096]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:46 compute-0 sudo[31094]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:46 compute-0 sudo[31120]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuxyokwhpqiidvtqfnmucbnxqqupfosc ; /usr/bin/python3'
Feb 27 16:36:46 compute-0 sudo[31120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:46 compute-0 python3[31122]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:46 compute-0 sudo[31120]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:46 compute-0 sudo[31193]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usnuqlszvmrlekaqremjotovvxtxclck ; /usr/bin/python3'
Feb 27 16:36:46 compute-0 sudo[31193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:47 compute-0 python3[31195]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:47 compute-0 sudo[31193]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:47 compute-0 sudo[31219]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eprllkpeyrwejxtsqqlqgsucmjpztunp ; /usr/bin/python3'
Feb 27 16:36:47 compute-0 sudo[31219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:47 compute-0 python3[31221]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:47 compute-0 sudo[31219]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:47 compute-0 sudo[31292]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lidesubwgimzdtebpdjyflcdcdeflgqm ; /usr/bin/python3'
Feb 27 16:36:47 compute-0 sudo[31292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:47 compute-0 python3[31294]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:47 compute-0 sudo[31292]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:47 compute-0 sudo[31318]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibuvgjcrrvoawrgfvzhiuryjwnahqslg ; /usr/bin/python3'
Feb 27 16:36:47 compute-0 sudo[31318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:48 compute-0 python3[31320]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 27 16:36:48 compute-0 sudo[31318]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:48 compute-0 sudo[31391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feeaosqcnwqdkjycsteevvgleqilzeco ; /usr/bin/python3'
Feb 27 16:36:48 compute-0 sudo[31391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:36:48 compute-0 python3[31393]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1772210203.737054-34187-122753029296364/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=06a0a916cb7cbc51b08d6616a672f1322305cccf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:36:48 compute-0 sudo[31391]: pam_unix(sudo:session): session closed for user root
Feb 27 16:36:50 compute-0 sshd-session[31418]: Unable to negotiate with 192.168.122.11 port 59886: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 27 16:36:50 compute-0 sshd-session[31419]: Connection closed by 192.168.122.11 port 59876 [preauth]
Feb 27 16:36:50 compute-0 sshd-session[31420]: Connection closed by 192.168.122.11 port 59884 [preauth]
Feb 27 16:36:50 compute-0 sshd-session[31422]: Unable to negotiate with 192.168.122.11 port 59890: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 27 16:36:50 compute-0 sshd-session[31421]: Unable to negotiate with 192.168.122.11 port 59902: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 27 16:36:59 compute-0 python3[31451]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:38:01 compute-0 sshd-session[31453]: Invalid user orangepi from 101.36.123.102 port 47118
Feb 27 16:38:02 compute-0 sshd-session[31453]: Connection closed by invalid user orangepi 101.36.123.102 port 47118 [preauth]
Feb 27 16:39:43 compute-0 systemd[1]: Starting dnf makecache...
Feb 27 16:39:43 compute-0 dnf[31456]: Failed determining last makecache time.
Feb 27 16:39:43 compute-0 dnf[31456]: delorean-openstack-barbican-42b4c41831408a8e323 359 kB/s |  13 kB     00:00
Feb 27 16:39:43 compute-0 dnf[31456]: delorean-python-glean-642fffe0203a8ffcc2443db52 3.0 MB/s |  65 kB     00:00
Feb 27 16:39:43 compute-0 dnf[31456]: delorean-openstack-cinder-e95a374f4f00ef02d562d 1.3 MB/s |  32 kB     00:00
Feb 27 16:39:43 compute-0 dnf[31456]: delorean-python-stevedore-c4acc5639fd2329372142 4.8 MB/s | 131 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-python-cloudkitty-tests-tempest-ef9563 1.5 MB/s |  32 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-diskimage-builder-cbb4478c143869181ba9  10 MB/s | 349 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-nova-5cfeecbf22fca58822607dd 1.9 MB/s |  42 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-python-designate-tests-tempest-347fdbc 870 kB/s |  18 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-glance-1fd12c29b339f30fe823e 474 kB/s |  18 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.2 MB/s |  29 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-manila-8fa2b5793100022b4d0f6 1.3 MB/s |  25 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-python-whitebox-neutron-tests-tempest- 6.7 MB/s | 153 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-octavia-76dfc1e35cf7f4dd6102 900 kB/s |  26 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-watcher-c014f81a8647287f6dcc 784 kB/s |  16 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-python-tcib-b403f1051724db0286e1418f59 341 kB/s | 7.4 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 5.4 MB/s | 144 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-swift-dc98a8463506ac520c469a 613 kB/s |  14 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-python-tempestconf-8e33668cda707818ee1 2.5 MB/s |  53 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: delorean-openstack-heat-ui-013accbfd179753bc3f0 4.1 MB/s |  96 kB     00:00
Feb 27 16:39:44 compute-0 dnf[31456]: CentOS Stream 9 - BaseOS                         65 kB/s | 7.0 kB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: CentOS Stream 9 - AppStream                      59 kB/s | 7.1 kB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: CentOS Stream 9 - CRB                            68 kB/s | 6.9 kB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: CentOS Stream 9 - Extras packages                73 kB/s | 7.6 kB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: dlrn-antelope-testing                            19 MB/s | 1.1 MB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: dlrn-antelope-build-deps                         16 MB/s | 461 kB     00:00
Feb 27 16:39:45 compute-0 dnf[31456]: centos9-rabbitmq                                6.1 MB/s | 123 kB     00:00
Feb 27 16:39:46 compute-0 dnf[31456]: centos9-storage                                  19 MB/s | 415 kB     00:00
Feb 27 16:39:46 compute-0 dnf[31456]: centos9-opstools                                4.6 MB/s |  51 kB     00:00
Feb 27 16:39:46 compute-0 dnf[31456]: NFV SIG OpenvSwitch                              22 MB/s | 465 kB     00:00
Feb 27 16:39:46 compute-0 dnf[31456]: repo-setup-centos-appstream                      94 MB/s |  27 MB     00:00
Feb 27 16:39:52 compute-0 dnf[31456]: repo-setup-centos-baseos                         67 MB/s | 8.9 MB     00:00
Feb 27 16:39:53 compute-0 dnf[31456]: repo-setup-centos-highavailability               10 MB/s | 744 kB     00:00
Feb 27 16:39:54 compute-0 dnf[31456]: repo-setup-centos-powertools                     32 MB/s | 8.0 MB     00:00
Feb 27 16:39:56 compute-0 dnf[31456]: Extra Packages for Enterprise Linux 9 - x86_64   22 MB/s |  20 MB     00:00
Feb 27 16:40:09 compute-0 dnf[31456]: Metadata cache created.
Feb 27 16:40:09 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 27 16:40:09 compute-0 systemd[1]: Finished dnf makecache.
Feb 27 16:40:09 compute-0 systemd[1]: dnf-makecache.service: Consumed 23.313s CPU time.
Feb 27 16:41:58 compute-0 sshd-session[30537]: Received disconnect from 38.102.83.159 port 53310:11: disconnected by user
Feb 27 16:41:58 compute-0 sshd-session[30537]: Disconnected from user zuul 38.102.83.159 port 53310
Feb 27 16:41:58 compute-0 sshd-session[30534]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:41:58 compute-0 systemd-logind[803]: Session 6 logged out. Waiting for processes to exit.
Feb 27 16:41:58 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Feb 27 16:41:58 compute-0 systemd[1]: session-6.scope: Consumed 4.886s CPU time.
Feb 27 16:41:58 compute-0 systemd-logind[803]: Removed session 6.
Feb 27 16:43:11 compute-0 sshd-session[31561]: Connection closed by authenticating user root 158.51.96.38 port 41200 [preauth]
Feb 27 16:43:12 compute-0 sshd-session[31563]: Connection closed by authenticating user root 158.51.96.38 port 63898 [preauth]
Feb 27 16:43:12 compute-0 sshd-session[31565]: Connection closed by authenticating user root 158.51.96.38 port 63910 [preauth]
Feb 27 16:43:12 compute-0 sshd-session[31567]: Connection closed by authenticating user root 158.51.96.38 port 63916 [preauth]
Feb 27 16:43:13 compute-0 sshd-session[31569]: Connection closed by authenticating user root 158.51.96.38 port 63922 [preauth]
Feb 27 16:43:13 compute-0 sshd-session[31571]: Connection closed by authenticating user root 158.51.96.38 port 63932 [preauth]
Feb 27 16:43:14 compute-0 sshd-session[31573]: Connection closed by authenticating user root 158.51.96.38 port 63944 [preauth]
Feb 27 16:43:14 compute-0 sshd-session[31575]: Connection closed by authenticating user root 158.51.96.38 port 63948 [preauth]
Feb 27 16:43:15 compute-0 sshd-session[31577]: Connection closed by authenticating user root 158.51.96.38 port 63962 [preauth]
Feb 27 16:43:15 compute-0 sshd-session[31579]: Connection closed by authenticating user root 158.51.96.38 port 63974 [preauth]
Feb 27 16:43:16 compute-0 sshd-session[31581]: Connection closed by authenticating user root 158.51.96.38 port 63984 [preauth]
Feb 27 16:43:16 compute-0 sshd-session[31583]: Connection closed by authenticating user root 158.51.96.38 port 63990 [preauth]
Feb 27 16:43:17 compute-0 sshd-session[31585]: Connection closed by authenticating user root 158.51.96.38 port 64006 [preauth]
Feb 27 16:43:17 compute-0 sshd-session[31587]: Connection closed by authenticating user root 158.51.96.38 port 64008 [preauth]
Feb 27 16:43:18 compute-0 sshd-session[31589]: Connection closed by authenticating user root 158.51.96.38 port 64014 [preauth]
Feb 27 16:43:18 compute-0 sshd-session[31591]: Connection closed by authenticating user root 158.51.96.38 port 64024 [preauth]
Feb 27 16:43:19 compute-0 sshd-session[31593]: Connection closed by authenticating user root 158.51.96.38 port 64034 [preauth]
Feb 27 16:43:19 compute-0 sshd-session[31595]: Connection closed by authenticating user root 158.51.96.38 port 64040 [preauth]
Feb 27 16:43:20 compute-0 sshd-session[31597]: Connection closed by authenticating user root 158.51.96.38 port 64052 [preauth]
Feb 27 16:43:20 compute-0 sshd-session[31599]: Connection closed by authenticating user root 158.51.96.38 port 64054 [preauth]
Feb 27 16:43:22 compute-0 sshd-session[31601]: Connection closed by authenticating user root 158.51.96.38 port 64060 [preauth]
Feb 27 16:43:22 compute-0 sshd-session[31603]: Connection closed by authenticating user root 158.51.96.38 port 13680 [preauth]
Feb 27 16:43:23 compute-0 sshd-session[31605]: Connection closed by authenticating user root 158.51.96.38 port 13682 [preauth]
Feb 27 16:43:23 compute-0 sshd-session[31607]: Connection closed by authenticating user root 158.51.96.38 port 13686 [preauth]
Feb 27 16:43:24 compute-0 sshd-session[31609]: Connection closed by authenticating user root 158.51.96.38 port 13692 [preauth]
Feb 27 16:43:24 compute-0 sshd-session[31611]: Connection closed by authenticating user root 158.51.96.38 port 13696 [preauth]
Feb 27 16:43:25 compute-0 sshd-session[31613]: Connection closed by authenticating user root 158.51.96.38 port 13710 [preauth]
Feb 27 16:43:25 compute-0 sshd-session[31615]: Connection closed by authenticating user root 158.51.96.38 port 13712 [preauth]
Feb 27 16:43:25 compute-0 sshd-session[31617]: Connection closed by authenticating user root 158.51.96.38 port 13724 [preauth]
Feb 27 16:43:26 compute-0 sshd-session[31619]: Connection closed by authenticating user root 158.51.96.38 port 13734 [preauth]
Feb 27 16:43:26 compute-0 sshd-session[31621]: Connection closed by authenticating user root 158.51.96.38 port 13746 [preauth]
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13760 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13762 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13774 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13780 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13786 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13798 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:27 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13802 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13818 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13822 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13834 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13842 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13848 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13850 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13854 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:28 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13860 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13866 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13872 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13874 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13878 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13884 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13900 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:29 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13902 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13918 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13924 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13934 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13936 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13946 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:30 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:13956 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2170 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2182 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2190 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2204 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2214 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2230 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:31 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2246 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2252 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2266 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2270 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2284 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2286 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2302 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:32 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2308 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2322 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2338 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2344 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2348 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2358 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:33 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2372 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2388 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2398 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2402 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2416 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2422 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2426 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:34 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2438 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2446 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2450 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2464 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2478 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2486 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2498 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2506 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2512 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2518 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2528 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2542 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2556 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2558 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2574 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2576 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2584 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2590 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2606 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2612 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2616 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2620 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2630 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2642 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2646 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2648 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2654 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2656 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2670 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2678 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2684 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2688 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2690 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2706 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2722 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2734 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2742 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2756 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2762 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2774 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2790 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:40 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:2792 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21436 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21440 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21454 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21456 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21462 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21478 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:41 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21486 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:42 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21500 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:43 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21506 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:43 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21512 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:43 compute-0 sshd-session[31623]: Invalid user user from 158.51.96.38 port 21520
Feb 27 16:43:43 compute-0 sshd-session[31623]: Connection closed by invalid user user 158.51.96.38 port 21520 [preauth]
Feb 27 16:43:44 compute-0 sshd-session[31625]: Invalid user user from 158.51.96.38 port 21534
Feb 27 16:43:44 compute-0 sshd-session[31625]: Connection closed by invalid user user 158.51.96.38 port 21534 [preauth]
Feb 27 16:43:44 compute-0 sshd-session[31627]: Invalid user user from 158.51.96.38 port 21540
Feb 27 16:43:44 compute-0 sshd-session[31627]: Connection closed by invalid user user 158.51.96.38 port 21540 [preauth]
Feb 27 16:43:45 compute-0 sshd-session[31629]: Invalid user user from 158.51.96.38 port 21542
Feb 27 16:43:45 compute-0 sshd-session[31629]: Connection closed by invalid user user 158.51.96.38 port 21542 [preauth]
Feb 27 16:43:45 compute-0 sshd-session[31631]: Invalid user user from 158.51.96.38 port 21554
Feb 27 16:43:45 compute-0 sshd-session[31631]: Connection closed by invalid user user 158.51.96.38 port 21554 [preauth]
Feb 27 16:43:46 compute-0 sshd-session[31633]: Invalid user user from 158.51.96.38 port 21570
Feb 27 16:43:46 compute-0 sshd-session[31633]: Connection closed by invalid user user 158.51.96.38 port 21570 [preauth]
Feb 27 16:43:46 compute-0 sshd-session[31635]: Invalid user user from 158.51.96.38 port 21584
Feb 27 16:43:46 compute-0 sshd-session[31635]: Connection closed by invalid user user 158.51.96.38 port 21584 [preauth]
Feb 27 16:43:47 compute-0 sshd-session[31637]: Invalid user user from 158.51.96.38 port 21592
Feb 27 16:43:47 compute-0 sshd-session[31637]: Connection closed by invalid user user 158.51.96.38 port 21592 [preauth]
Feb 27 16:43:48 compute-0 sshd-session[31639]: Invalid user user from 158.51.96.38 port 21604
Feb 27 16:43:48 compute-0 sshd-session[31639]: Connection closed by invalid user user 158.51.96.38 port 21604 [preauth]
Feb 27 16:43:48 compute-0 sshd-session[31641]: Invalid user user from 158.51.96.38 port 21608
Feb 27 16:43:48 compute-0 sshd-session[31641]: Connection closed by invalid user user 158.51.96.38 port 21608 [preauth]
Feb 27 16:43:49 compute-0 sshd-session[31643]: Invalid user user from 158.51.96.38 port 21614
Feb 27 16:43:49 compute-0 sshd-session[31643]: Connection closed by invalid user user 158.51.96.38 port 21614 [preauth]
Feb 27 16:43:49 compute-0 sshd-session[31645]: Invalid user user from 158.51.96.38 port 21626
Feb 27 16:43:49 compute-0 sshd-session[31645]: Connection closed by invalid user user 158.51.96.38 port 21626 [preauth]
Feb 27 16:43:49 compute-0 sshd-session[31647]: Invalid user user from 158.51.96.38 port 21636
Feb 27 16:43:50 compute-0 sshd-session[31647]: Connection closed by invalid user user 158.51.96.38 port 21636 [preauth]
Feb 27 16:43:50 compute-0 sshd-session[31649]: Invalid user user from 158.51.96.38 port 21648
Feb 27 16:43:50 compute-0 sshd-session[31649]: Connection closed by invalid user user 158.51.96.38 port 21648 [preauth]
Feb 27 16:43:51 compute-0 sshd-session[31651]: Invalid user user from 158.51.96.38 port 21654
Feb 27 16:43:51 compute-0 sshd-session[31651]: Connection closed by invalid user user 158.51.96.38 port 21654 [preauth]
Feb 27 16:43:51 compute-0 sshd-session[31653]: Invalid user user from 158.51.96.38 port 36142
Feb 27 16:43:51 compute-0 sshd-session[31653]: Connection closed by invalid user user 158.51.96.38 port 36142 [preauth]
Feb 27 16:43:51 compute-0 sshd-session[31655]: Invalid user user from 158.51.96.38 port 36146
Feb 27 16:43:52 compute-0 sshd-session[31655]: Connection closed by invalid user user 158.51.96.38 port 36146 [preauth]
Feb 27 16:43:52 compute-0 sshd-session[31657]: Invalid user user from 158.51.96.38 port 36166
Feb 27 16:43:52 compute-0 sshd-session[31657]: Connection closed by invalid user user 158.51.96.38 port 36166 [preauth]
Feb 27 16:43:52 compute-0 sshd-session[31659]: Invalid user user from 158.51.96.38 port 36168
Feb 27 16:43:52 compute-0 sshd-session[31659]: Connection closed by invalid user user 158.51.96.38 port 36168 [preauth]
Feb 27 16:43:53 compute-0 sshd-session[31661]: Invalid user user from 158.51.96.38 port 36174
Feb 27 16:43:53 compute-0 sshd-session[31661]: Connection closed by invalid user user 158.51.96.38 port 36174 [preauth]
Feb 27 16:43:53 compute-0 sshd-session[31663]: Invalid user user from 158.51.96.38 port 36190
Feb 27 16:43:53 compute-0 sshd-session[31663]: Connection closed by invalid user user 158.51.96.38 port 36190 [preauth]
Feb 27 16:43:54 compute-0 sshd-session[31665]: Invalid user user from 158.51.96.38 port 36202
Feb 27 16:43:54 compute-0 sshd-session[31665]: Connection closed by invalid user user 158.51.96.38 port 36202 [preauth]
Feb 27 16:43:54 compute-0 sshd-session[31667]: Invalid user user from 158.51.96.38 port 36212
Feb 27 16:43:55 compute-0 sshd-session[31667]: Connection closed by invalid user user 158.51.96.38 port 36212 [preauth]
Feb 27 16:43:55 compute-0 sshd-session[31669]: Invalid user user from 158.51.96.38 port 36214
Feb 27 16:43:55 compute-0 sshd-session[31669]: Connection closed by invalid user user 158.51.96.38 port 36214 [preauth]
Feb 27 16:43:55 compute-0 sshd-session[31671]: Invalid user user from 158.51.96.38 port 36218
Feb 27 16:43:55 compute-0 sshd-session[31671]: Connection closed by invalid user user 158.51.96.38 port 36218 [preauth]
Feb 27 16:43:56 compute-0 sshd-session[31673]: Invalid user user from 158.51.96.38 port 36234
Feb 27 16:43:56 compute-0 sshd-session[31673]: Connection closed by invalid user user 158.51.96.38 port 36234 [preauth]
Feb 27 16:43:56 compute-0 sshd-session[31675]: Invalid user user from 158.51.96.38 port 36242
Feb 27 16:43:56 compute-0 sshd-session[31675]: Connection closed by invalid user user 158.51.96.38 port 36242 [preauth]
Feb 27 16:43:57 compute-0 sshd-session[31677]: Invalid user ubuntu from 158.51.96.38 port 36254
Feb 27 16:43:57 compute-0 sshd-session[31677]: Connection closed by invalid user ubuntu 158.51.96.38 port 36254 [preauth]
Feb 27 16:43:57 compute-0 sshd-session[31679]: Invalid user ubuntu from 158.51.96.38 port 36260
Feb 27 16:43:57 compute-0 sshd-session[31679]: Connection closed by invalid user ubuntu 158.51.96.38 port 36260 [preauth]
Feb 27 16:43:58 compute-0 sshd-session[31681]: Invalid user ubuntu from 158.51.96.38 port 36268
Feb 27 16:43:58 compute-0 sshd-session[31681]: Connection closed by invalid user ubuntu 158.51.96.38 port 36268 [preauth]
Feb 27 16:43:58 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36284 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:58 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36298 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:58 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36302 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36314 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36330 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36338 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36354 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36368 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36384 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36394 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:43:59 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36406 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:00 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36416 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:00 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36426 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:01 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:36438 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:01 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20716 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:01 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20728 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:01 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20744 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:01 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20752 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20760 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20764 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20780 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20784 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20792 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20806 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:02 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20810 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20824 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20830 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20838 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20844 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20848 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20862 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:03 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20868 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20878 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20892 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20898 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20912 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20914 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:04 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20926 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20942 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20958 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20972 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20980 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20982 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:20996 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:05 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21012 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21020 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21028 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21044 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21060 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21066 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:06 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21070 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21080 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21092 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21106 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21120 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21134 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21146 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:07 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21156 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21170 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21172 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21184 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21188 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21190 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:08 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21196 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21200 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21204 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21208 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21220 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21236 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:09 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21242 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21252 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21256 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21264 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21266 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21282 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21292 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:10 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:21298 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15800 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15808 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15812 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15820 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15832 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15838 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:11 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15848 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15860 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15862 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15876 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15884 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15900 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:12 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15902 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15910 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15922 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15930 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15936 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15938 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:13 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15954 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:14 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15956 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:14 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15972 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:14 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:15974 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:14 compute-0 sshd-session[31683]: Invalid user debian from 158.51.96.38 port 15984
Feb 27 16:44:14 compute-0 sshd-session[31683]: Connection closed by invalid user debian 158.51.96.38 port 15984 [preauth]
Feb 27 16:44:15 compute-0 sshd-session[31685]: Invalid user debian from 158.51.96.38 port 15992
Feb 27 16:44:15 compute-0 sshd-session[31685]: Connection closed by invalid user debian 158.51.96.38 port 15992 [preauth]
Feb 27 16:44:16 compute-0 sshd-session[31687]: Invalid user debian from 158.51.96.38 port 16006
Feb 27 16:44:16 compute-0 sshd-session[31687]: Connection closed by invalid user debian 158.51.96.38 port 16006 [preauth]
Feb 27 16:44:17 compute-0 sshd-session[31689]: Invalid user debian from 158.51.96.38 port 16016
Feb 27 16:44:17 compute-0 sshd-session[31689]: Connection closed by invalid user debian 158.51.96.38 port 16016 [preauth]
Feb 27 16:44:18 compute-0 sshd-session[31691]: Invalid user debian from 158.51.96.38 port 16018
Feb 27 16:44:18 compute-0 sshd-session[31691]: Connection closed by invalid user debian 158.51.96.38 port 16018 [preauth]
Feb 27 16:44:18 compute-0 sshd-session[31693]: Invalid user debian from 158.51.96.38 port 16020
Feb 27 16:44:18 compute-0 sshd-session[31693]: Connection closed by invalid user debian 158.51.96.38 port 16020 [preauth]
Feb 27 16:44:19 compute-0 sshd-session[31695]: Invalid user debian from 158.51.96.38 port 16036
Feb 27 16:44:19 compute-0 sshd-session[31695]: Connection closed by invalid user debian 158.51.96.38 port 16036 [preauth]
Feb 27 16:44:19 compute-0 sshd-session[31697]: Invalid user debian from 158.51.96.38 port 16050
Feb 27 16:44:19 compute-0 sshd-session[31697]: Connection closed by invalid user debian 158.51.96.38 port 16050 [preauth]
Feb 27 16:44:20 compute-0 sshd-session[31699]: Invalid user debian from 158.51.96.38 port 16056
Feb 27 16:44:20 compute-0 sshd-session[31699]: Connection closed by invalid user debian 158.51.96.38 port 16056 [preauth]
Feb 27 16:44:20 compute-0 sshd-session[31701]: Invalid user debian from 158.51.96.38 port 16072
Feb 27 16:44:20 compute-0 sshd-session[31701]: Connection closed by invalid user debian 158.51.96.38 port 16072 [preauth]
Feb 27 16:44:21 compute-0 sshd-session[31703]: Invalid user debian from 158.51.96.38 port 16076
Feb 27 16:44:21 compute-0 sshd-session[31703]: Connection closed by invalid user debian 158.51.96.38 port 16076 [preauth]
Feb 27 16:44:21 compute-0 sshd-session[31705]: Invalid user debian from 158.51.96.38 port 35872
Feb 27 16:44:21 compute-0 sshd-session[31705]: Connection closed by invalid user debian 158.51.96.38 port 35872 [preauth]
Feb 27 16:44:21 compute-0 sshd-session[31707]: Invalid user debian from 158.51.96.38 port 35876
Feb 27 16:44:21 compute-0 sshd-session[31707]: Connection closed by invalid user debian 158.51.96.38 port 35876 [preauth]
Feb 27 16:44:22 compute-0 sshd-session[31709]: Invalid user debian from 158.51.96.38 port 35884
Feb 27 16:44:22 compute-0 sshd-session[31709]: Connection closed by invalid user debian 158.51.96.38 port 35884 [preauth]
Feb 27 16:44:22 compute-0 sshd-session[31711]: Invalid user debian from 158.51.96.38 port 35894
Feb 27 16:44:22 compute-0 sshd-session[31711]: Connection closed by invalid user debian 158.51.96.38 port 35894 [preauth]
Feb 27 16:44:23 compute-0 sshd-session[31713]: Invalid user debian from 158.51.96.38 port 35902
Feb 27 16:44:23 compute-0 sshd-session[31713]: Connection closed by invalid user debian 158.51.96.38 port 35902 [preauth]
Feb 27 16:44:23 compute-0 sshd-session[31715]: Invalid user debian from 158.51.96.38 port 35908
Feb 27 16:44:23 compute-0 sshd-session[31715]: Connection closed by invalid user debian 158.51.96.38 port 35908 [preauth]
Feb 27 16:44:24 compute-0 sshd-session[31717]: Invalid user debian from 158.51.96.38 port 35910
Feb 27 16:44:24 compute-0 sshd-session[31717]: Connection closed by invalid user debian 158.51.96.38 port 35910 [preauth]
Feb 27 16:44:24 compute-0 sshd-session[31719]: Invalid user debian from 158.51.96.38 port 35918
Feb 27 16:44:24 compute-0 sshd-session[31719]: Connection closed by invalid user debian 158.51.96.38 port 35918 [preauth]
Feb 27 16:44:25 compute-0 sshd-session[31721]: Invalid user debian from 158.51.96.38 port 35932
Feb 27 16:44:25 compute-0 sshd-session[31721]: Connection closed by invalid user debian 158.51.96.38 port 35932 [preauth]
Feb 27 16:44:25 compute-0 sshd-session[31723]: Invalid user debian from 158.51.96.38 port 35936
Feb 27 16:44:25 compute-0 sshd-session[31723]: Connection closed by invalid user debian 158.51.96.38 port 35936 [preauth]
Feb 27 16:44:26 compute-0 sshd-session[31725]: Invalid user debian from 158.51.96.38 port 35944
Feb 27 16:44:26 compute-0 sshd-session[31725]: Connection closed by invalid user debian 158.51.96.38 port 35944 [preauth]
Feb 27 16:44:26 compute-0 sshd-session[31727]: Invalid user debian from 158.51.96.38 port 35960
Feb 27 16:44:26 compute-0 sshd-session[31727]: Connection closed by invalid user debian 158.51.96.38 port 35960 [preauth]
Feb 27 16:44:27 compute-0 sshd-session[31729]: Invalid user debian from 158.51.96.38 port 35972
Feb 27 16:44:27 compute-0 sshd-session[31729]: Connection closed by invalid user debian 158.51.96.38 port 35972 [preauth]
Feb 27 16:44:27 compute-0 sshd-session[31731]: Invalid user debian from 158.51.96.38 port 35974
Feb 27 16:44:27 compute-0 sshd-session[31731]: Connection closed by invalid user debian 158.51.96.38 port 35974 [preauth]
Feb 27 16:44:28 compute-0 sshd-session[31733]: Invalid user debian from 158.51.96.38 port 35986
Feb 27 16:44:28 compute-0 sshd-session[31733]: Connection closed by invalid user debian 158.51.96.38 port 35986 [preauth]
Feb 27 16:44:29 compute-0 sshd-session[31735]: Invalid user debian from 158.51.96.38 port 35992
Feb 27 16:44:29 compute-0 sshd-session[31735]: Connection closed by invalid user debian 158.51.96.38 port 35992 [preauth]
Feb 27 16:44:30 compute-0 sshd-session[31737]: Invalid user debian from 158.51.96.38 port 36004
Feb 27 16:44:31 compute-0 sshd-session[31737]: Connection closed by invalid user debian 158.51.96.38 port 36004 [preauth]
Feb 27 16:44:31 compute-0 sshd-session[31739]: Invalid user debian from 158.51.96.38 port 32786
Feb 27 16:44:31 compute-0 sshd-session[31739]: Connection closed by invalid user debian 158.51.96.38 port 32786 [preauth]
Feb 27 16:44:32 compute-0 sshd-session[31741]: Invalid user debian from 158.51.96.38 port 32794
Feb 27 16:44:32 compute-0 sshd-session[31741]: Connection closed by invalid user debian 158.51.96.38 port 32794 [preauth]
Feb 27 16:44:32 compute-0 sshd-session[31743]: Invalid user debian from 158.51.96.38 port 32804
Feb 27 16:44:32 compute-0 sshd-session[31743]: Connection closed by invalid user debian 158.51.96.38 port 32804 [preauth]
Feb 27 16:44:32 compute-0 sshd-session[31745]: Invalid user debian from 158.51.96.38 port 32808
Feb 27 16:44:33 compute-0 sshd-session[31745]: Connection closed by invalid user debian 158.51.96.38 port 32808 [preauth]
Feb 27 16:44:33 compute-0 sshd-session[31747]: Invalid user debian from 158.51.96.38 port 32812
Feb 27 16:44:33 compute-0 sshd-session[31747]: Connection closed by invalid user debian 158.51.96.38 port 32812 [preauth]
Feb 27 16:44:33 compute-0 sshd-session[31749]: Invalid user debian from 158.51.96.38 port 32826
Feb 27 16:44:33 compute-0 sshd-session[31749]: Connection closed by invalid user debian 158.51.96.38 port 32826 [preauth]
Feb 27 16:44:34 compute-0 sshd-session[31751]: Invalid user debian from 158.51.96.38 port 32842
Feb 27 16:44:34 compute-0 sshd-session[31751]: Connection closed by invalid user debian 158.51.96.38 port 32842 [preauth]
Feb 27 16:44:34 compute-0 sshd-session[31753]: Invalid user debian from 158.51.96.38 port 32850
Feb 27 16:44:35 compute-0 sshd-session[31753]: Connection closed by invalid user debian 158.51.96.38 port 32850 [preauth]
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32852 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32868 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32882 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32890 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32894 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:35 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32910 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32926 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:36 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32930 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32944 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32948 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32960 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32970 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:37 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32980 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:32992 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33008 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33022 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33028 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33038 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33054 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:38 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33056 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33058 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33070 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33078 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:44:39 compute-0 sshd[1013]: drop connection #0 from [158.51.96.38]:33092 on [38.129.56.53]:22 penalty: connections without attempting authentication
Feb 27 16:48:01 compute-0 sshd-session[31757]: Accepted publickey for zuul from 192.168.122.30 port 36860 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:48:01 compute-0 systemd-logind[803]: New session 7 of user zuul.
Feb 27 16:48:01 compute-0 systemd[1]: Started Session 7 of User zuul.
Feb 27 16:48:01 compute-0 sshd-session[31757]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:48:02 compute-0 python3.9[31910]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:48:03 compute-0 sudo[32089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkipxjebivcmbyhvicevavuwkdbvvysq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210883.4340053-27-155724939063924/AnsiballZ_command.py'
Feb 27 16:48:03 compute-0 sudo[32089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:04 compute-0 python3.9[32092]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:48:22 compute-0 sudo[32089]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:22 compute-0 sshd-session[31760]: Connection closed by 192.168.122.30 port 36860
Feb 27 16:48:22 compute-0 sshd-session[31757]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:48:22 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Feb 27 16:48:22 compute-0 systemd[1]: session-7.scope: Consumed 8.041s CPU time.
Feb 27 16:48:22 compute-0 systemd-logind[803]: Session 7 logged out. Waiting for processes to exit.
Feb 27 16:48:22 compute-0 systemd-logind[803]: Removed session 7.
Feb 27 16:48:28 compute-0 sshd-session[32152]: Accepted publickey for zuul from 192.168.122.30 port 45088 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:48:28 compute-0 systemd-logind[803]: New session 8 of user zuul.
Feb 27 16:48:28 compute-0 systemd[1]: Started Session 8 of User zuul.
Feb 27 16:48:28 compute-0 sshd-session[32152]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:48:29 compute-0 python3.9[32305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:48:29 compute-0 sshd-session[32155]: Connection closed by 192.168.122.30 port 45088
Feb 27 16:48:29 compute-0 sshd-session[32152]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:48:29 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Feb 27 16:48:29 compute-0 systemd-logind[803]: Session 8 logged out. Waiting for processes to exit.
Feb 27 16:48:29 compute-0 systemd-logind[803]: Removed session 8.
Feb 27 16:48:45 compute-0 sshd-session[32334]: Accepted publickey for zuul from 192.168.122.30 port 41706 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:48:45 compute-0 systemd-logind[803]: New session 9 of user zuul.
Feb 27 16:48:45 compute-0 systemd[1]: Started Session 9 of User zuul.
Feb 27 16:48:45 compute-0 sshd-session[32334]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:48:46 compute-0 python3.9[32487]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 27 16:48:47 compute-0 python3.9[32661]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:48:48 compute-0 sudo[32811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hupukomupywormktrhikwaqovoqmnneu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210927.653003-40-222692061746958/AnsiballZ_command.py'
Feb 27 16:48:48 compute-0 sudo[32811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:48 compute-0 python3.9[32814]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:48:48 compute-0 sudo[32811]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:49 compute-0 sudo[32965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hczsvobwdhdgmzjheghovxueeuypozys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210928.897785-52-173100054744451/AnsiballZ_stat.py'
Feb 27 16:48:49 compute-0 sudo[32965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:49 compute-0 python3.9[32968]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:48:49 compute-0 sudo[32965]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:50 compute-0 sudo[33118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurpvxnrcehcvbwtvmsdwtzliyhjirjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210929.7474465-60-240009962139129/AnsiballZ_file.py'
Feb 27 16:48:50 compute-0 sudo[33118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:50 compute-0 python3.9[33121]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:48:50 compute-0 sudo[33118]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:50 compute-0 sudo[33271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qikiotebeaykotslklgttnomdubjpiij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210930.6259823-68-215100531494712/AnsiballZ_stat.py'
Feb 27 16:48:50 compute-0 sudo[33271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:51 compute-0 python3.9[33274]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:48:51 compute-0 sudo[33271]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:51 compute-0 sudo[33395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qayacwoyxmbjzhlcqguokcfzymgwpnsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210930.6259823-68-215100531494712/AnsiballZ_copy.py'
Feb 27 16:48:51 compute-0 sudo[33395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:51 compute-0 python3.9[33398]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1772210930.6259823-68-215100531494712/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:48:51 compute-0 sudo[33395]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:52 compute-0 sudo[33548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcxjsrxwfillcfkwowxmusecnjjyatgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210932.1536758-83-127005427370340/AnsiballZ_setup.py'
Feb 27 16:48:52 compute-0 sudo[33548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:52 compute-0 python3.9[33551]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:48:52 compute-0 sudo[33548]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:53 compute-0 sudo[33705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxrcmfhhdzgbfzdxecpjybotqqgfswcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210933.1208544-91-126748014899944/AnsiballZ_file.py'
Feb 27 16:48:53 compute-0 sudo[33705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:53 compute-0 python3.9[33708]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:48:53 compute-0 sudo[33705]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:54 compute-0 sudo[33858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywnfsfphumgfyfbhijgopyqtorucjvzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210933.8397658-100-268687241074816/AnsiballZ_file.py'
Feb 27 16:48:54 compute-0 sudo[33858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:48:54 compute-0 python3.9[33861]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:48:54 compute-0 sudo[33858]: pam_unix(sudo:session): session closed for user root
Feb 27 16:48:55 compute-0 python3.9[34011]: ansible-ansible.builtin.service_facts Invoked
Feb 27 16:48:58 compute-0 python3.9[34265]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:48:58 compute-0 python3.9[34415]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:49:00 compute-0 python3.9[34569]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:49:00 compute-0 sudo[34725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqxiojvletbmqjbnxsgrgugdsklsjcpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210940.5691419-148-94277399820729/AnsiballZ_setup.py'
Feb 27 16:49:00 compute-0 sudo[34725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:49:01 compute-0 python3.9[34728]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:49:01 compute-0 sudo[34725]: pam_unix(sudo:session): session closed for user root
Feb 27 16:49:01 compute-0 sudo[34810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtkcsvvukfddosvhfoliwzhbmnkjkbux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772210940.5691419-148-94277399820729/AnsiballZ_dnf.py'
Feb 27 16:49:01 compute-0 sudo[34810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:49:02 compute-0 python3.9[34813]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:49:46 compute-0 systemd[1]: Reloading.
Feb 27 16:49:46 compute-0 systemd-rc-local-generator[35001]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:49:46 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 27 16:49:47 compute-0 systemd[1]: Reloading.
Feb 27 16:49:47 compute-0 systemd-rc-local-generator[35057]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:49:47 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 27 16:49:47 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 27 16:49:47 compute-0 systemd[1]: Reloading.
Feb 27 16:49:47 compute-0 systemd-rc-local-generator[35104]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:49:47 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Feb 27 16:49:47 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 16:49:47 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 16:50:43 compute-0 kernel: SELinux:  Converting 2728 SID table entries...
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:50:44 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:50:44 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 27 16:50:44 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:50:44 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:50:44 compute-0 systemd[1]: Reloading.
Feb 27 16:50:44 compute-0 systemd-rc-local-generator[35433]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:50:44 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:50:45 compute-0 sudo[34810]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:45 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:50:45 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:50:45 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.088s CPU time.
Feb 27 16:50:45 compute-0 systemd[1]: run-r8ec210e7596c45ce98e1a2b18b05d6c3.service: Deactivated successfully.
Feb 27 16:50:45 compute-0 sudo[36367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koxrbybfqcyyfqetdzgkpwlrgmoidzyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211045.2283177-160-176448990001646/AnsiballZ_command.py'
Feb 27 16:50:45 compute-0 sudo[36367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:45 compute-0 python3.9[36370]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:50:46 compute-0 sudo[36367]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:47 compute-0 sudo[36649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gchpczhwfowidabdcektnpuwmizmhjsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211047.065647-168-245075140069092/AnsiballZ_selinux.py'
Feb 27 16:50:47 compute-0 sudo[36649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:47 compute-0 python3.9[36652]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 27 16:50:47 compute-0 sudo[36649]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:48 compute-0 sudo[36802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juiwcluibqtieuabqarfmqcozdzzrdsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211048.3063478-179-215597923930536/AnsiballZ_command.py'
Feb 27 16:50:48 compute-0 sudo[36802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:49 compute-0 python3.9[36805]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 27 16:50:49 compute-0 sudo[36802]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:50 compute-0 sudo[36957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueqeouloeyikqcerpotbhdofwkbiyieb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211049.7910023-187-36559063353156/AnsiballZ_file.py'
Feb 27 16:50:50 compute-0 sudo[36957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:51 compute-0 python3.9[36960]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:50:51 compute-0 sudo[36957]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:52 compute-0 sudo[37110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqdeuibdozubcclqnujcinfikbkhaphh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211052.0634055-195-45055389043074/AnsiballZ_mount.py'
Feb 27 16:50:52 compute-0 sudo[37110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:52 compute-0 python3.9[37113]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 27 16:50:52 compute-0 sudo[37110]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:53 compute-0 sudo[37263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwrdpuromthjescsbgbvthvulkrmcgzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211053.495327-223-164201171765698/AnsiballZ_file.py'
Feb 27 16:50:53 compute-0 sudo[37263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:54 compute-0 python3.9[37266]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:50:54 compute-0 sudo[37263]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:54 compute-0 sudo[37416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvuijpupimmrvbqhtynhgtxrvgckvpbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211054.2124848-231-76882197491810/AnsiballZ_stat.py'
Feb 27 16:50:54 compute-0 sudo[37416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:54 compute-0 python3.9[37419]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:50:54 compute-0 sudo[37416]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:55 compute-0 sudo[37540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfqwcwutwnaxvldapnypbqlhmqocbic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211054.2124848-231-76882197491810/AnsiballZ_copy.py'
Feb 27 16:50:55 compute-0 sudo[37540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:57 compute-0 sshd-session[37544]: Connection closed by 77.90.185.16 port 65105
Feb 27 16:50:58 compute-0 python3.9[37543]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211054.2124848-231-76882197491810/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:50:58 compute-0 sudo[37540]: pam_unix(sudo:session): session closed for user root
Feb 27 16:50:59 compute-0 sudo[37694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufwbmlkmsjqarxukbpiyrmxxyrruojqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211059.1792161-255-10313625193178/AnsiballZ_stat.py'
Feb 27 16:50:59 compute-0 sudo[37694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:50:59 compute-0 python3.9[37697]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:50:59 compute-0 sudo[37694]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:00 compute-0 sudo[37847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ainoudyppdnlmpaeyydxlinmdwpyvtak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211059.844242-263-170352833181596/AnsiballZ_command.py'
Feb 27 16:51:00 compute-0 sudo[37847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:00 compute-0 python3.9[37850]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:00 compute-0 sudo[37847]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:00 compute-0 sudo[38001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdzfflvzquvyjsvgluppjbfwocxcfhfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211060.6074915-271-156253358446546/AnsiballZ_file.py'
Feb 27 16:51:00 compute-0 sudo[38001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:01 compute-0 python3.9[38004]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:51:01 compute-0 sudo[38001]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:01 compute-0 sudo[38154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akteteduvgnchjtnmhllxtxtrujpnmnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211061.5459194-282-192351301650718/AnsiballZ_getent.py'
Feb 27 16:51:01 compute-0 sudo[38154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:02 compute-0 python3.9[38157]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 27 16:51:02 compute-0 sudo[38154]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:02 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 16:51:02 compute-0 sudo[38309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-marvmrjaujpsozwjmpxuimpdgxiffsmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211062.3885245-290-277992180221147/AnsiballZ_group.py'
Feb 27 16:51:02 compute-0 sudo[38309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:03 compute-0 python3.9[38312]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 16:51:03 compute-0 groupadd[38313]: group added to /etc/group: name=qemu, GID=107
Feb 27 16:51:03 compute-0 groupadd[38313]: group added to /etc/gshadow: name=qemu
Feb 27 16:51:03 compute-0 groupadd[38313]: new group: name=qemu, GID=107
Feb 27 16:51:03 compute-0 sudo[38309]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:03 compute-0 sudo[38468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-autrclaadcyrmjluguqlqjcvqitkbcsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211063.3003929-298-186697646764988/AnsiballZ_user.py'
Feb 27 16:51:03 compute-0 sudo[38468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:04 compute-0 python3.9[38471]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 27 16:51:04 compute-0 useradd[38473]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/1
Feb 27 16:51:04 compute-0 sudo[38468]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:04 compute-0 sudo[38629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnwwunzaucpnwfexdyymunbocwawftfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211064.3096628-306-7331629343824/AnsiballZ_getent.py'
Feb 27 16:51:04 compute-0 sudo[38629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:04 compute-0 python3.9[38632]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 27 16:51:04 compute-0 sudo[38629]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:05 compute-0 sudo[38783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orektjnxodzkrlurhysaecnrcxnzrsdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211064.9873593-314-277631526253026/AnsiballZ_group.py'
Feb 27 16:51:05 compute-0 sudo[38783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:05 compute-0 python3.9[38786]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 16:51:05 compute-0 groupadd[38787]: group added to /etc/group: name=hugetlbfs, GID=42477
Feb 27 16:51:05 compute-0 groupadd[38787]: group added to /etc/gshadow: name=hugetlbfs
Feb 27 16:51:05 compute-0 groupadd[38787]: new group: name=hugetlbfs, GID=42477
Feb 27 16:51:05 compute-0 sudo[38783]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:06 compute-0 sudo[38942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pisciycurayiefipkqkirtmuzwgclbdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211065.7998013-323-179229270205841/AnsiballZ_file.py'
Feb 27 16:51:06 compute-0 sudo[38942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:06 compute-0 python3.9[38945]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 27 16:51:06 compute-0 sudo[38942]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:06 compute-0 sudo[39095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqlsnnmfznkszqdxkhfejirqacnbdweb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211066.6633348-334-207615917377968/AnsiballZ_dnf.py'
Feb 27 16:51:06 compute-0 sudo[39095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:07 compute-0 python3.9[39098]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:51:08 compute-0 sudo[39095]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:09 compute-0 sudo[39249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loiryxyokblmtqnuuydxyzafaetjsswj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211068.8082871-342-19309745176429/AnsiballZ_file.py'
Feb 27 16:51:09 compute-0 sudo[39249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:09 compute-0 python3.9[39252]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:09 compute-0 sudo[39249]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:09 compute-0 sudo[39402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydcpwumpwbrjwpirklxxobhefmbmtgvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211069.4963222-350-104286879272487/AnsiballZ_stat.py'
Feb 27 16:51:09 compute-0 sudo[39402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:09 compute-0 python3.9[39405]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:51:09 compute-0 sudo[39402]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:10 compute-0 sudo[39526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsfpqrxbgnagmselcvbvuqoxfzofccue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211069.4963222-350-104286879272487/AnsiballZ_copy.py'
Feb 27 16:51:10 compute-0 sudo[39526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:10 compute-0 python3.9[39529]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211069.4963222-350-104286879272487/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:10 compute-0 sudo[39526]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:11 compute-0 sudo[39679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhaifxcvilyjswnivxrlquzihdjitulv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211070.77481-365-70358485316538/AnsiballZ_systemd.py'
Feb 27 16:51:11 compute-0 sudo[39679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:11 compute-0 python3.9[39682]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:51:11 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 27 16:51:11 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 27 16:51:11 compute-0 kernel: Bridge firewalling registered
Feb 27 16:51:11 compute-0 systemd-modules-load[39686]: Inserted module 'br_netfilter'
Feb 27 16:51:11 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 27 16:51:11 compute-0 sudo[39679]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:12 compute-0 sudo[39840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-engqzucvieishlguwrpvoiggsbampleb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211072.0025818-373-220981941954093/AnsiballZ_stat.py'
Feb 27 16:51:12 compute-0 sudo[39840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:12 compute-0 python3.9[39843]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:51:12 compute-0 sudo[39840]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:12 compute-0 sudo[39964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-neppqrrwlrdaursnxtxuccikueikvlhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211072.0025818-373-220981941954093/AnsiballZ_copy.py'
Feb 27 16:51:12 compute-0 sudo[39964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:13 compute-0 python3.9[39967]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211072.0025818-373-220981941954093/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:13 compute-0 sudo[39964]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:13 compute-0 sudo[40117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xstxjmdqlufmwfhcmkntookffuzrkgfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211073.5245166-391-244999967367601/AnsiballZ_dnf.py'
Feb 27 16:51:13 compute-0 sudo[40117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:14 compute-0 python3.9[40120]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:51:17 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 16:51:17 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 16:51:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:51:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:51:17 compute-0 systemd[1]: Reloading.
Feb 27 16:51:17 compute-0 systemd-rc-local-generator[40180]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:51:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:51:18 compute-0 sudo[40117]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:18 compute-0 python3.9[41744]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:51:19 compute-0 python3.9[42923]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 27 16:51:20 compute-0 python3.9[43817]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:51:20 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:51:20 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:51:20 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.367s CPU time.
Feb 27 16:51:20 compute-0 systemd[1]: run-r8c3daa4e61054e29849a20240a8ffa8f.service: Deactivated successfully.
Feb 27 16:51:21 compute-0 sudo[44348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaeqayfrwwzdvblpuazqxatonzlavjaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211080.7937138-430-112755364850096/AnsiballZ_command.py'
Feb 27 16:51:21 compute-0 sudo[44348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:21 compute-0 python3.9[44351]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:21 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 27 16:51:21 compute-0 systemd[1]: Starting Authorization Manager...
Feb 27 16:51:21 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 27 16:51:21 compute-0 polkitd[44568]: Started polkitd version 0.117
Feb 27 16:51:21 compute-0 polkitd[44568]: Loading rules from directory /etc/polkit-1/rules.d
Feb 27 16:51:21 compute-0 polkitd[44568]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 27 16:51:21 compute-0 polkitd[44568]: Finished loading, compiling and executing 2 rules
Feb 27 16:51:21 compute-0 systemd[1]: Started Authorization Manager.
Feb 27 16:51:21 compute-0 polkitd[44568]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 27 16:51:22 compute-0 sudo[44348]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:22 compute-0 sudo[44736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnvqjhpjvjlawayadlcddpsqdxswydmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211082.23536-439-3868438846751/AnsiballZ_systemd.py'
Feb 27 16:51:22 compute-0 sudo[44736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:22 compute-0 python3.9[44739]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:51:22 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 27 16:51:22 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Feb 27 16:51:22 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 27 16:51:22 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 27 16:51:23 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Feb 27 16:51:23 compute-0 sudo[44736]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:23 compute-0 python3.9[44900]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 27 16:51:26 compute-0 sudo[45050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzadzispjxwfuoumsuqmsiraxtevxnny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211085.8681803-496-12246032262815/AnsiballZ_systemd.py'
Feb 27 16:51:26 compute-0 sudo[45050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:26 compute-0 python3.9[45053]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:51:26 compute-0 systemd[1]: Reloading.
Feb 27 16:51:26 compute-0 systemd-rc-local-generator[45077]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:51:26 compute-0 sudo[45050]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:27 compute-0 sudo[45247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldjytchcywomssrdyvvihosaixnmdkzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211086.7957222-496-40937197931724/AnsiballZ_systemd.py'
Feb 27 16:51:27 compute-0 sudo[45247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:27 compute-0 python3.9[45250]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:51:27 compute-0 systemd[1]: Reloading.
Feb 27 16:51:27 compute-0 systemd-rc-local-generator[45278]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:51:27 compute-0 sudo[45247]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:28 compute-0 sudo[45445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctvlnojanzelertrujxddaxpzrksgnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211087.7873642-512-44344464218240/AnsiballZ_command.py'
Feb 27 16:51:28 compute-0 sudo[45445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:28 compute-0 python3.9[45448]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:28 compute-0 sudo[45445]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:28 compute-0 sudo[45599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eccdymagiiaauytvxcfbbcfjhdyipjgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211088.439252-520-119407162373629/AnsiballZ_command.py'
Feb 27 16:51:28 compute-0 sudo[45599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:28 compute-0 python3.9[45602]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:28 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Feb 27 16:51:28 compute-0 sudo[45599]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:29 compute-0 sudo[45753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xiggzfjdhngfyysjlveoiwvyhxiteizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211089.033188-528-184853490009222/AnsiballZ_command.py'
Feb 27 16:51:29 compute-0 sudo[45753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:29 compute-0 python3.9[45756]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:30 compute-0 sudo[45753]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:31 compute-0 sudo[45916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfndcyjhyzefwmpxnsgkvrfayarvlsht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211091.183681-536-172531841053381/AnsiballZ_command.py'
Feb 27 16:51:31 compute-0 sudo[45916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:31 compute-0 python3.9[45919]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:31 compute-0 sudo[45916]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:32 compute-0 sudo[46070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjvvqrddwftybkxzriunaetoyagjrbxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211091.8587246-544-155047420894068/AnsiballZ_systemd.py'
Feb 27 16:51:32 compute-0 sudo[46070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:32 compute-0 python3.9[46073]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:51:33 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 27 16:51:33 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Feb 27 16:51:33 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Feb 27 16:51:33 compute-0 systemd[1]: Starting Apply Kernel Variables...
Feb 27 16:51:33 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 27 16:51:33 compute-0 systemd[1]: Finished Apply Kernel Variables.
Feb 27 16:51:33 compute-0 sudo[46070]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:34 compute-0 sshd-session[32337]: Connection closed by 192.168.122.30 port 41706
Feb 27 16:51:34 compute-0 sshd-session[32334]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:51:34 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Feb 27 16:51:34 compute-0 systemd[1]: session-9.scope: Consumed 2min 1.891s CPU time.
Feb 27 16:51:34 compute-0 systemd-logind[803]: Session 9 logged out. Waiting for processes to exit.
Feb 27 16:51:34 compute-0 systemd-logind[803]: Removed session 9.
Feb 27 16:51:39 compute-0 sshd-session[46103]: Accepted publickey for zuul from 192.168.122.30 port 49082 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:51:39 compute-0 systemd-logind[803]: New session 10 of user zuul.
Feb 27 16:51:39 compute-0 systemd[1]: Started Session 10 of User zuul.
Feb 27 16:51:39 compute-0 sshd-session[46103]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:51:40 compute-0 python3.9[46256]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:51:41 compute-0 python3.9[46410]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:51:42 compute-0 sudo[46564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sesaisyyddzvmmyoqxpzjkruibudqlrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211102.1610806-45-88225788224407/AnsiballZ_command.py'
Feb 27 16:51:42 compute-0 sudo[46564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:42 compute-0 python3.9[46567]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:42 compute-0 sudo[46564]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:43 compute-0 python3.9[46718]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:51:44 compute-0 sudo[46872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjnxmkhdlapawkdeuoegqomhuskgmaak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211104.1788993-65-197294723051065/AnsiballZ_setup.py'
Feb 27 16:51:44 compute-0 sudo[46872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:44 compute-0 python3.9[46875]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:51:45 compute-0 sudo[46872]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:45 compute-0 sudo[46957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igkzkfpexyubeisahfzrqifuinvsjgkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211104.1788993-65-197294723051065/AnsiballZ_dnf.py'
Feb 27 16:51:45 compute-0 sudo[46957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:46 compute-0 python3.9[46960]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:51:47 compute-0 sudo[46957]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:47 compute-0 sudo[47111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brznolynwmfowoqznfzfparxadgufzac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211107.5320892-77-147980808742885/AnsiballZ_setup.py'
Feb 27 16:51:47 compute-0 sudo[47111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:48 compute-0 python3.9[47114]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:51:48 compute-0 sudo[47111]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:49 compute-0 sudo[47283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pksvvkpknlxzuqdilyylytuutpjyuvzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211108.599827-88-185313182973154/AnsiballZ_file.py'
Feb 27 16:51:49 compute-0 sudo[47283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:49 compute-0 python3.9[47286]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:51:49 compute-0 sudo[47283]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:49 compute-0 sudo[47436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhdkkgbvbehbivcuyjktdrwbforqkdve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211109.4684463-96-34502670246943/AnsiballZ_command.py'
Feb 27 16:51:49 compute-0 sudo[47436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:49 compute-0 python3.9[47439]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:51:50 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2643618200-merged.mount: Deactivated successfully.
Feb 27 16:51:50 compute-0 podman[47440]: 2026-02-27 16:51:50.038134964 +0000 UTC m=+0.054176218 system refresh
Feb 27 16:51:50 compute-0 sudo[47436]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:50 compute-0 sudo[47600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvnnosgeonqyiyoosvrqemmfgqlueohh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211110.2621107-104-84695016949045/AnsiballZ_stat.py'
Feb 27 16:51:50 compute-0 sudo[47600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:50 compute-0 python3.9[47603]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:51:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:51:51 compute-0 sudo[47600]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:51 compute-0 sudo[47724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtuydnaxryjsyzywgiwmfqygbjlqylag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211110.2621107-104-84695016949045/AnsiballZ_copy.py'
Feb 27 16:51:51 compute-0 sudo[47724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:51 compute-0 python3.9[47727]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211110.2621107-104-84695016949045/.source.json follow=False _original_basename=podman_network_config.j2 checksum=5bef7f0bf9e8ab2ad548bd6e1cacf5b2db5dc731 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:51:51 compute-0 sudo[47724]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:52 compute-0 sudo[47877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzakzemyuzdrrcoctosguynihbufhhtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211111.9348226-119-71056426579110/AnsiballZ_stat.py'
Feb 27 16:51:52 compute-0 sudo[47877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:52 compute-0 python3.9[47880]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:51:52 compute-0 sudo[47877]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:52 compute-0 sudo[48001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfllcoyhquzvsktmlkqnieiyznvtvhav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211111.9348226-119-71056426579110/AnsiballZ_copy.py'
Feb 27 16:51:52 compute-0 sudo[48001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:53 compute-0 python3.9[48004]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211111.9348226-119-71056426579110/.source.conf follow=False _original_basename=registries.conf.j2 checksum=5e56947af0346c1263217349f098d20e53826a09 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:53 compute-0 sudo[48001]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:53 compute-0 sudo[48154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyofcgfadzflxaorohyzbxxjriekidvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211113.263523-135-17102347359969/AnsiballZ_ini_file.py'
Feb 27 16:51:53 compute-0 sudo[48154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:53 compute-0 python3.9[48157]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:53 compute-0 sudo[48154]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:54 compute-0 sudo[48307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwcdjyblcmahomgjipptqnignoioulhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211114.050197-135-266756410720845/AnsiballZ_ini_file.py'
Feb 27 16:51:54 compute-0 sudo[48307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:54 compute-0 python3.9[48310]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:54 compute-0 sudo[48307]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:54 compute-0 sudo[48460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpyglaeptuiwquiyskumrmczrgsqlgsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211114.6897426-135-69750888453207/AnsiballZ_ini_file.py'
Feb 27 16:51:54 compute-0 sudo[48460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:55 compute-0 python3.9[48463]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:55 compute-0 sudo[48460]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:55 compute-0 sudo[48613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdyslcxmuwrxpabckezcvdemkvmsluuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211115.3020787-135-107315048326323/AnsiballZ_ini_file.py'
Feb 27 16:51:55 compute-0 sudo[48613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:55 compute-0 python3.9[48616]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:51:55 compute-0 sudo[48613]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:56 compute-0 python3.9[48766]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:51:57 compute-0 sudo[48918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mszelljyzvphilmkwyffiqextfdovygy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211117.0684824-175-271481550051372/AnsiballZ_dnf.py'
Feb 27 16:51:57 compute-0 sudo[48918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:57 compute-0 python3.9[48921]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:51:58 compute-0 sudo[48918]: pam_unix(sudo:session): session closed for user root
Feb 27 16:51:59 compute-0 sudo[49072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqlccrypwpnsonwbmjsjfjgydidgvic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211118.8891509-183-79764996120748/AnsiballZ_dnf.py'
Feb 27 16:51:59 compute-0 sudo[49072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:51:59 compute-0 python3.9[49075]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:00 compute-0 sudo[49072]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:01 compute-0 sudo[49234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywurjnbqlpicwnoxhfucvttlwloarwfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211121.0851874-193-248809174686970/AnsiballZ_dnf.py'
Feb 27 16:52:01 compute-0 sudo[49234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:01 compute-0 python3.9[49237]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:02 compute-0 sudo[49234]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:03 compute-0 sudo[49388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxiwgvqqdqdeybqgnaywqnshbcbnxauq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211122.9214406-202-236246367829516/AnsiballZ_dnf.py'
Feb 27 16:52:03 compute-0 sudo[49388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:03 compute-0 python3.9[49391]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:04 compute-0 sudo[49388]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:05 compute-0 sudo[49542]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toenlnbruvrmuxsqorvnoavdrckwxuwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211124.8954995-213-196741863543939/AnsiballZ_dnf.py'
Feb 27 16:52:05 compute-0 sudo[49542]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:05 compute-0 python3.9[49545]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:07 compute-0 sudo[49542]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:07 compute-0 sudo[49699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkmyrstwktenralflfqmfbugklwsejkr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211127.210167-221-207768239048634/AnsiballZ_dnf.py'
Feb 27 16:52:07 compute-0 sudo[49699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:07 compute-0 python3.9[49702]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:11 compute-0 sudo[49699]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:11 compute-0 sudo[49869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdteqdbawzfcremopuyptxooiddcldjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211131.470496-230-166131064662686/AnsiballZ_dnf.py'
Feb 27 16:52:11 compute-0 sudo[49869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:11 compute-0 python3.9[49872]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:13 compute-0 sudo[49869]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:13 compute-0 sudo[50023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcmqfgqnbbkcyaochduubfkdiouyfuvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211133.42716-239-5219069859934/AnsiballZ_dnf.py'
Feb 27 16:52:13 compute-0 sudo[50023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:13 compute-0 python3.9[50026]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:28 compute-0 sudo[50023]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:28 compute-0 sudo[50361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssntaudlusclzsdaqrordpxmrbunuwaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211148.457928-248-271462988162821/AnsiballZ_dnf.py'
Feb 27 16:52:28 compute-0 sudo[50361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:28 compute-0 python3.9[50364]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:30 compute-0 sudo[50361]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:30 compute-0 sudo[50518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuwjfgzwulktasishkdluyeyjakyfsma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211150.4667304-258-226653729936282/AnsiballZ_dnf.py'
Feb 27 16:52:30 compute-0 sudo[50518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:31 compute-0 python3.9[50521]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:52:33 compute-0 sudo[50518]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:34 compute-0 sudo[50676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrvknqdfnmykevtiwmvsijajcsrwgvbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211154.3015716-269-212434204823321/AnsiballZ_file.py'
Feb 27 16:52:34 compute-0 sudo[50676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:34 compute-0 python3.9[50679]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:52:34 compute-0 sudo[50676]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:35 compute-0 sudo[50852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlpskapzojljzmuzrkzukrexkxretvvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211154.9640334-277-74062140051260/AnsiballZ_stat.py'
Feb 27 16:52:35 compute-0 sudo[50852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:35 compute-0 python3.9[50855]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:52:35 compute-0 sudo[50852]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:35 compute-0 sudo[50976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aivuvhwpjyrhiagvntswakpqgtxzjwbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211154.9640334-277-74062140051260/AnsiballZ_copy.py'
Feb 27 16:52:35 compute-0 sudo[50976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:35 compute-0 python3.9[50979]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1772211154.9640334-277-74062140051260/.source.json _original_basename=.4dewmmbl follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:52:35 compute-0 sudo[50976]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:36 compute-0 sudo[51129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrojvhqtuuskatqdxtdongfokattkrij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211156.2703109-295-114902500775218/AnsiballZ_podman_image.py'
Feb 27 16:52:36 compute-0 sudo[51129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:36 compute-0 python3.9[51132]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 27 16:52:37 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat967319131-lower\x2dmapped.mount: Deactivated successfully.
Feb 27 16:52:41 compute-0 podman[51144]: 2026-02-27 16:52:41.922952764 +0000 UTC m=+4.873768333 image pull ce6781f051bf092c13d84cb587c56ad7edaa58b70fcc0effc1dff15724d5232e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 27 16:52:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:42 compute-0 sudo[51129]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:42 compute-0 sudo[51436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmtoiujafxuyhuqwxkushheafxgyzkpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211162.350218-306-176468630859969/AnsiballZ_podman_image.py'
Feb 27 16:52:42 compute-0 sudo[51436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:42 compute-0 python3.9[51439]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 27 16:52:51 compute-0 podman[51451]: 2026-02-27 16:52:51.112042183 +0000 UTC m=+8.238644485 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 16:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:52:51 compute-0 sudo[51436]: pam_unix(sudo:session): session closed for user root
Feb 27 16:52:51 compute-0 sudo[51767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxviulkctishgryspvccmfjvyefscalm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211171.6777716-316-238424005383018/AnsiballZ_podman_image.py'
Feb 27 16:52:51 compute-0 sudo[51767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:52:52 compute-0 python3.9[51770]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 27 16:52:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:03 compute-0 podman[51783]: 2026-02-27 16:53:03.517507866 +0000 UTC m=+11.275513999 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 27 16:53:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:03 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:03 compute-0 sudo[51767]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:04 compute-0 sudo[52042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imaedzwfdguqhwbwiujitlsrfehrxahx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211184.1961112-327-82763807129849/AnsiballZ_podman_image.py'
Feb 27 16:53:04 compute-0 sudo[52042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:04 compute-0 python3.9[52045]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 27 16:53:04 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:09 compute-0 podman[52058]: 2026-02-27 16:53:09.888187265 +0000 UTC m=+5.085468080 image pull 3afa173a26fa8128cbac14b6c3e676d8aa6fde1ace8c482832813e31985446eb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 27 16:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:09 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:10 compute-0 sudo[52042]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:10 compute-0 sudo[52316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-satbldewsbcofqimgociyetiqjsyvqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211190.1807716-327-266306970066054/AnsiballZ_podman_image.py'
Feb 27 16:53:10 compute-0 sudo[52316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:10 compute-0 python3.9[52319]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 27 16:53:11 compute-0 podman[52332]: 2026-02-27 16:53:11.618690347 +0000 UTC m=+0.947001840 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 27 16:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:11 compute-0 sudo[52316]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:53:12 compute-0 sshd-session[46106]: Connection closed by 192.168.122.30 port 49082
Feb 27 16:53:12 compute-0 sshd-session[46103]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:53:12 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Feb 27 16:53:12 compute-0 systemd[1]: session-10.scope: Consumed 1min 36.202s CPU time.
Feb 27 16:53:12 compute-0 systemd-logind[803]: Session 10 logged out. Waiting for processes to exit.
Feb 27 16:53:12 compute-0 systemd-logind[803]: Removed session 10.
Feb 27 16:53:17 compute-0 sshd-session[52482]: Accepted publickey for zuul from 192.168.122.30 port 44916 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:53:17 compute-0 systemd-logind[803]: New session 11 of user zuul.
Feb 27 16:53:17 compute-0 systemd[1]: Started Session 11 of User zuul.
Feb 27 16:53:17 compute-0 sshd-session[52482]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:53:18 compute-0 python3.9[52635]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:53:19 compute-0 sudo[52789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihupezosiurgrhpsrlqjyjzqlceielvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211199.3370943-32-107123331933582/AnsiballZ_getent.py'
Feb 27 16:53:19 compute-0 sudo[52789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:19 compute-0 python3.9[52792]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 27 16:53:19 compute-0 sudo[52789]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:20 compute-0 sudo[52943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weflupdbeavgaqguiyzdqvwrjhpnqqpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211200.1443286-40-33526498679640/AnsiballZ_group.py'
Feb 27 16:53:20 compute-0 sudo[52943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:20 compute-0 python3.9[52946]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 16:53:20 compute-0 groupadd[52947]: group added to /etc/group: name=openvswitch, GID=42476
Feb 27 16:53:20 compute-0 groupadd[52947]: group added to /etc/gshadow: name=openvswitch
Feb 27 16:53:20 compute-0 groupadd[52947]: new group: name=openvswitch, GID=42476
Feb 27 16:53:20 compute-0 sudo[52943]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:21 compute-0 sudo[53102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zoramnwpwlxkgqipewqrtruphopgbjof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211201.0555892-48-9707055023683/AnsiballZ_user.py'
Feb 27 16:53:21 compute-0 sudo[53102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:21 compute-0 python3.9[53105]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 27 16:53:21 compute-0 useradd[53107]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/1
Feb 27 16:53:21 compute-0 useradd[53107]: add 'openvswitch' to group 'hugetlbfs'
Feb 27 16:53:21 compute-0 useradd[53107]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 27 16:53:21 compute-0 sudo[53102]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:22 compute-0 sudo[53263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tywqqeickqqnfgcwpqkzkbtwafduvvvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211202.161416-58-192498307765468/AnsiballZ_setup.py'
Feb 27 16:53:22 compute-0 sudo[53263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:22 compute-0 python3.9[53266]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:53:22 compute-0 sudo[53263]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:23 compute-0 sudo[53348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqagopjlhahcfpxeelostbiqyswxcotl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211202.161416-58-192498307765468/AnsiballZ_dnf.py'
Feb 27 16:53:23 compute-0 sudo[53348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:23 compute-0 python3.9[53351]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:53:24 compute-0 sudo[53348]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:25 compute-0 sudo[53511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pezdsjpxzsbrlnrbmznuuetzlvfydvwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211205.2809882-72-4550622234061/AnsiballZ_dnf.py'
Feb 27 16:53:25 compute-0 sudo[53511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:25 compute-0 python3.9[53514]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:53:37 compute-0 kernel: SELinux:  Converting 2741 SID table entries...
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:53:37 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:53:37 compute-0 groupadd[53539]: group added to /etc/group: name=unbound, GID=994
Feb 27 16:53:37 compute-0 groupadd[53539]: group added to /etc/gshadow: name=unbound
Feb 27 16:53:37 compute-0 groupadd[53539]: new group: name=unbound, GID=994
Feb 27 16:53:37 compute-0 useradd[53546]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Feb 27 16:53:37 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 27 16:53:37 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 27 16:53:39 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:53:39 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:53:39 compute-0 systemd[1]: Reloading.
Feb 27 16:53:39 compute-0 systemd-sysv-generator[54047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:53:39 compute-0 systemd-rc-local-generator[54043]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:53:39 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:53:40 compute-0 sudo[53511]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:40 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:53:40 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:53:40 compute-0 systemd[1]: run-rb61dd361ea7f4f0898b66ad19d67f58a.service: Deactivated successfully.
Feb 27 16:53:40 compute-0 sudo[54637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgccrmdlsrjycpleiyuybuaihplqayua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211220.2453992-80-142256675854309/AnsiballZ_systemd.py'
Feb 27 16:53:40 compute-0 sudo[54637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:41 compute-0 python3.9[54640]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 16:53:41 compute-0 systemd[1]: Reloading.
Feb 27 16:53:41 compute-0 systemd-sysv-generator[54676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:53:41 compute-0 systemd-rc-local-generator[54668]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:53:41 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Feb 27 16:53:41 compute-0 chown[54689]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 27 16:53:41 compute-0 ovs-ctl[54694]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 27 16:53:41 compute-0 ovs-ctl[54694]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 27 16:53:41 compute-0 ovs-ctl[54694]: Starting ovsdb-server [  OK  ]
Feb 27 16:53:41 compute-0 ovs-vsctl[54743]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 27 16:53:41 compute-0 ovs-vsctl[54763]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"114486db-e8a8-4651-8c2f-bcfde6c6e156\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Feb 27 16:53:41 compute-0 ovs-ctl[54694]: Configuring Open vSwitch system IDs [  OK  ]
Feb 27 16:53:41 compute-0 ovs-ctl[54694]: Enabling remote OVSDB managers [  OK  ]
Feb 27 16:53:41 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Feb 27 16:53:41 compute-0 ovs-vsctl[54769]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 27 16:53:41 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 27 16:53:41 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 27 16:53:41 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 27 16:53:42 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Feb 27 16:53:42 compute-0 ovs-ctl[54814]: Inserting openvswitch module [  OK  ]
Feb 27 16:53:42 compute-0 ovs-ctl[54783]: Starting ovs-vswitchd [  OK  ]
Feb 27 16:53:42 compute-0 ovs-vsctl[54831]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Feb 27 16:53:42 compute-0 ovs-ctl[54783]: Enabling remote OVSDB managers [  OK  ]
Feb 27 16:53:42 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 27 16:53:42 compute-0 systemd[1]: Starting Open vSwitch...
Feb 27 16:53:42 compute-0 systemd[1]: Finished Open vSwitch.
Feb 27 16:53:42 compute-0 sudo[54637]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:43 compute-0 python3.9[54983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:53:43 compute-0 sudo[55133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnibpfixhouznadbizuokztplsojwagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211223.3312037-99-248374950123670/AnsiballZ_sefcontext.py'
Feb 27 16:53:43 compute-0 sudo[55133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:43 compute-0 python3.9[55136]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 27 16:53:45 compute-0 kernel: SELinux:  Converting 2755 SID table entries...
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 16:53:45 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 16:53:45 compute-0 sudo[55133]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:46 compute-0 python3.9[55291]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:53:46 compute-0 sudo[55447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-equecltbkwfzklthokrktvaoavmbejsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211226.587535-117-256026026054383/AnsiballZ_dnf.py'
Feb 27 16:53:46 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 27 16:53:46 compute-0 sudo[55447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:47 compute-0 python3.9[55450]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:53:48 compute-0 sudo[55447]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:49 compute-0 sudo[55601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxxmmprdhodxybzqpmgnpkbvaimezzxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211229.1242838-125-96934635380639/AnsiballZ_command.py'
Feb 27 16:53:49 compute-0 sudo[55601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:49 compute-0 python3.9[55604]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:53:50 compute-0 sudo[55601]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:51 compute-0 sudo[55889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpzmzsvnybuawrjntwavizoohphrfemy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211230.6001334-133-169210100328035/AnsiballZ_file.py'
Feb 27 16:53:51 compute-0 sudo[55889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:51 compute-0 python3.9[55892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 27 16:53:51 compute-0 sudo[55889]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:52 compute-0 python3.9[56042]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:53:52 compute-0 sudo[56194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebuqqrthghyammgpgblamofapbmptpsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211232.2384624-149-22298271241161/AnsiballZ_dnf.py'
Feb 27 16:53:52 compute-0 sudo[56194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:52 compute-0 python3.9[56197]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:53:54 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:53:54 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:53:54 compute-0 systemd[1]: Reloading.
Feb 27 16:53:54 compute-0 systemd-rc-local-generator[56234]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:53:54 compute-0 systemd-sysv-generator[56239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:53:54 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:53:55 compute-0 sudo[56194]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:55 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:53:55 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:53:55 compute-0 systemd[1]: run-re36201eadb8a4576acd8ceade52fd06e.service: Deactivated successfully.
Feb 27 16:53:55 compute-0 sudo[56520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kynjoazwkkoadgxvlxtmidosikvdiazl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211235.4152265-157-39446016092197/AnsiballZ_systemd.py'
Feb 27 16:53:55 compute-0 sudo[56520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:56 compute-0 python3.9[56523]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:53:56 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 27 16:53:56 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Feb 27 16:53:56 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Feb 27 16:53:56 compute-0 systemd[1]: Stopping Network Manager...
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0593] caught SIGTERM, shutting down normally.
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0609] dhcp4 (eth0): canceled DHCP transaction
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0609] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0609] dhcp4 (eth0): state changed no lease
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0613] manager: NetworkManager state is now CONNECTED_SITE
Feb 27 16:53:56 compute-0 NetworkManager[7681]: <info>  [1772211236.0744] exiting (success)
Feb 27 16:53:56 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:53:56 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:53:56 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 27 16:53:56 compute-0 systemd[1]: Stopped Network Manager.
Feb 27 16:53:56 compute-0 systemd[1]: NetworkManager.service: Consumed 11.689s CPU time, 4.1M memory peak, read 0B from disk, written 20.5K to disk.
Feb 27 16:53:56 compute-0 systemd[1]: Starting Network Manager...
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.1582] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:7134621a-8b85-4cf7-b630-8b1bd86c0689)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.1585] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.1651] manager[0x55788ec39000]: monitoring kernel firmware directory '/lib/firmware'.
Feb 27 16:53:56 compute-0 systemd[1]: Starting Hostname Service...
Feb 27 16:53:56 compute-0 systemd[1]: Started Hostname Service.
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2619] hostname: hostname: using hostnamed
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2620] hostname: static hostname changed from (none) to "compute-0"
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2625] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2630] manager[0x55788ec39000]: rfkill: Wi-Fi hardware radio set enabled
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2630] manager[0x55788ec39000]: rfkill: WWAN hardware radio set enabled
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2654] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2664] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2665] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2666] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2666] manager: Networking is enabled by state file
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2668] settings: Loaded settings plugin: keyfile (internal)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2672] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2701] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2712] dhcp: init: Using DHCP client 'internal'
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2715] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2721] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2728] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2737] device (lo): Activation: starting connection 'lo' (64e865f2-9b77-47c6-8998-ed75b7b9b4c4)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2743] device (eth0): carrier: link connected
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2747] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2752] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2753] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2761] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2768] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2774] device (eth1): carrier: link connected
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2778] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2783] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (4c61635a-3137-500c-b3cf-3399f56581ba) (indicated)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2783] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2790] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2798] device (eth1): Activation: starting connection 'ci-private-network' (4c61635a-3137-500c-b3cf-3399f56581ba)
Feb 27 16:53:56 compute-0 systemd[1]: Started Network Manager.
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2804] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2820] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2824] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2826] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2828] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2830] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2833] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2835] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2838] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2844] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2847] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2855] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.2867] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3207] dhcp4 (eth0): state changed new lease, address=38.129.56.53
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3216] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 27 16:53:56 compute-0 sudo[56520]: pam_unix(sudo:session): session closed for user root
Feb 27 16:53:56 compute-0 systemd[1]: Starting Network Manager Wait Online...
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3346] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3358] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3361] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3364] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3371] device (lo): Activation: successful, device activated.
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3381] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3386] manager: NetworkManager state is now CONNECTED_LOCAL
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3391] device (eth1): Activation: successful, device activated.
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3406] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3408] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3413] manager: NetworkManager state is now CONNECTED_SITE
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3417] device (eth0): Activation: successful, device activated.
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3425] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 27 16:53:56 compute-0 NetworkManager[56537]: <info>  [1772211236.3430] manager: startup complete
Feb 27 16:53:56 compute-0 systemd[1]: Finished Network Manager Wait Online.
Feb 27 16:53:56 compute-0 sudo[56748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbmiejjzaggfrxzjvtbdqeiefhjixwhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211236.5140743-165-268362320068594/AnsiballZ_dnf.py'
Feb 27 16:53:56 compute-0 sudo[56748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:53:57 compute-0 python3.9[56751]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:54:02 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 16:54:02 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 16:54:02 compute-0 systemd[1]: Reloading.
Feb 27 16:54:02 compute-0 systemd-sysv-generator[56805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:54:02 compute-0 systemd-rc-local-generator[56802]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:54:02 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 16:54:03 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 16:54:03 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 16:54:03 compute-0 systemd[1]: run-rd434741348ac4b8baee9bb5f4dd9921f.service: Deactivated successfully.
Feb 27 16:54:03 compute-0 sudo[56748]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:04 compute-0 sudo[57227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etgyvspvjwewvyxvomsurnbjkllduwca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211243.9482174-177-266694286236449/AnsiballZ_stat.py'
Feb 27 16:54:04 compute-0 sudo[57227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:04 compute-0 python3.9[57230]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:54:04 compute-0 sudo[57227]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:05 compute-0 sudo[57380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwdittoeutxuracupgzlbqjlviqdxjwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211244.667445-186-275218453911170/AnsiballZ_ini_file.py'
Feb 27 16:54:05 compute-0 sudo[57380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:05 compute-0 python3.9[57383]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:05 compute-0 sudo[57380]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:05 compute-0 sudo[57535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhtnvjapctwsvxrglheihnbanonyplkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211245.5644844-196-220740602901584/AnsiballZ_ini_file.py'
Feb 27 16:54:05 compute-0 sudo[57535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:06 compute-0 python3.9[57538]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:06 compute-0 sudo[57535]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:06 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:54:06 compute-0 sudo[57688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unplzoszkyroiuoasctxrcmwvezokebt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211246.253217-196-169168810977846/AnsiballZ_ini_file.py'
Feb 27 16:54:06 compute-0 sudo[57688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:06 compute-0 python3.9[57691]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:06 compute-0 sudo[57688]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:07 compute-0 sudo[57841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qasryvylfpzvzdspqdwabnetaknayesx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211246.9913747-211-189236029985153/AnsiballZ_ini_file.py'
Feb 27 16:54:07 compute-0 sudo[57841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:07 compute-0 python3.9[57844]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:07 compute-0 sudo[57841]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:07 compute-0 sudo[57994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iidxywgyeelzsexqbeilpwhpvloxjtbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211247.645755-211-173478519090687/AnsiballZ_ini_file.py'
Feb 27 16:54:07 compute-0 sudo[57994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:08 compute-0 python3.9[57997]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:08 compute-0 sudo[57994]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:08 compute-0 sudo[58147]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whnkotmwwhixnfgkhgsqzjyqsmiflaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211248.332046-226-13452449664761/AnsiballZ_stat.py'
Feb 27 16:54:08 compute-0 sudo[58147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:08 compute-0 python3.9[58150]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:54:08 compute-0 sudo[58147]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:09 compute-0 sudo[58271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgbqqtcjhylzzvjpphtirwowfsetaqna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211248.332046-226-13452449664761/AnsiballZ_copy.py'
Feb 27 16:54:09 compute-0 sudo[58271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:09 compute-0 python3.9[58274]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211248.332046-226-13452449664761/.source _original_basename=.175_j2k5 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:09 compute-0 sudo[58271]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:10 compute-0 sudo[58424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmcyrjgfuxhmanvoecewrstjysbcenpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211249.810561-241-57216992346949/AnsiballZ_file.py'
Feb 27 16:54:10 compute-0 sudo[58424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:10 compute-0 python3.9[58427]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:10 compute-0 sudo[58424]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:10 compute-0 sudo[58577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlkuwrvlkkywecfqzgaqfhjwenqsmbyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211250.5664573-249-167751988506770/AnsiballZ_edpm_os_net_config_mappings.py'
Feb 27 16:54:10 compute-0 sudo[58577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:11 compute-0 python3.9[58580]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 27 16:54:11 compute-0 sudo[58577]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:11 compute-0 sudo[58730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jglaxswrasmnhtvxbjemxrlarhsgvuvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211251.4144695-258-76247952730201/AnsiballZ_file.py'
Feb 27 16:54:11 compute-0 sudo[58730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:11 compute-0 python3.9[58733]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:11 compute-0 sudo[58730]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:12 compute-0 sudo[58883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzsvsghwgiayzbpajrgpcmtrwsapsych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211252.2462373-268-129994042610103/AnsiballZ_stat.py'
Feb 27 16:54:12 compute-0 sudo[58883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:12 compute-0 sudo[58883]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:13 compute-0 sudo[59007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzmifbentthyfomuayqkktwwthicgmst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211252.2462373-268-129994042610103/AnsiballZ_copy.py'
Feb 27 16:54:13 compute-0 sudo[59007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:13 compute-0 sudo[59007]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:13 compute-0 sudo[59160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llqhglirtobvfhauaywccmqwfxoiwlny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211253.500529-283-174357655348526/AnsiballZ_slurp.py'
Feb 27 16:54:13 compute-0 sudo[59160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:14 compute-0 python3.9[59164]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 27 16:54:14 compute-0 sudo[59160]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:15 compute-0 sudo[59337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whnxtnhzvpdttmpesgkgszcyluwdalsy ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211254.3354235-292-152344405085857/async_wrapper.py j386025950692 300 /home/zuul/.ansible/tmp/ansible-tmp-1772211254.3354235-292-152344405085857/AnsiballZ_edpm_os_net_config.py _'
Feb 27 16:54:15 compute-0 sudo[59337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:15 compute-0 ansible-async_wrapper.py[59340]: Invoked with j386025950692 300 /home/zuul/.ansible/tmp/ansible-tmp-1772211254.3354235-292-152344405085857/AnsiballZ_edpm_os_net_config.py _
Feb 27 16:54:15 compute-0 ansible-async_wrapper.py[59343]: Starting module and watcher
Feb 27 16:54:15 compute-0 ansible-async_wrapper.py[59343]: Start watching 59344 (300)
Feb 27 16:54:15 compute-0 ansible-async_wrapper.py[59344]: Start module (59344)
Feb 27 16:54:15 compute-0 ansible-async_wrapper.py[59340]: Return async_wrapper task started.
Feb 27 16:54:15 compute-0 sudo[59337]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:15 compute-0 python3.9[59345]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=True purge_provider=
Feb 27 16:54:16 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 27 16:54:16 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 27 16:54:16 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Feb 27 16:54:16 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 27 16:54:16 compute-0 kernel: cfg80211: failed to load regulatory.db
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.4368] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.4381] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.4988] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.4990] audit: op="connection-add" uuid="00338e49-ee03-4eee-ba80-fc0d7429070b" name="br-ex-br" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5005] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5007] audit: op="connection-add" uuid="ac9bf0f8-eaac-4604-acfa-2adb6bd9800f" name="br-ex-port" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5020] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5021] audit: op="connection-add" uuid="23cd4aaa-de91-43da-94f6-0993ce8faa71" name="eth1-port" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5034] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5036] audit: op="connection-add" uuid="f7bb927f-f7cb-408b-8196-821a196da601" name="vlan20-port" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5049] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5050] audit: op="connection-add" uuid="665a6b6d-0639-4f80-a528-1737475327c8" name="vlan21-port" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5063] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5065] audit: op="connection-add" uuid="573d2f84-70d6-4ad9-a7db-ce1d26aec78c" name="vlan22-port" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5084] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5101] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5103] audit: op="connection-add" uuid="f02707d9-e4fe-4669-9cbc-5a2cf427f827" name="br-ex-if" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5137] audit: op="connection-update" uuid="4c61635a-3137-500c-b3cf-3399f56581ba" name="ci-private-network" args="ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.method,ipv6.routing-rules,ovs-interface.type,ovs-external-ids.data,connection.port-type,connection.master,connection.controller,connection.slave-type,connection.timestamp,ipv4.addresses,ipv4.never-default,ipv4.dns,ipv4.routes,ipv4.method,ipv4.routing-rules" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5153] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5155] audit: op="connection-add" uuid="e5b0b47e-a9e1-403f-84e6-27c9d8b1e239" name="vlan20-if" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5172] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5174] audit: op="connection-add" uuid="a1d851b2-e26d-4371-882a-7060a5d3b49d" name="vlan21-if" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5191] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5193] audit: op="connection-add" uuid="de47d997-9e8f-4c7b-b230-8142814791aa" name="vlan22-if" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5206] audit: op="connection-delete" uuid="281fa0e3-e3e5-328a-b833-575ca1dafd63" name="Wired connection 1" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5219] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5221] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5229] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5232] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (00338e49-ee03-4eee-ba80-fc0d7429070b)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5232] audit: op="connection-activate" uuid="00338e49-ee03-4eee-ba80-fc0d7429070b" name="br-ex-br" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5234] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5236] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5241] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5246] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (ac9bf0f8-eaac-4604-acfa-2adb6bd9800f)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5249] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5250] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5255] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5260] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (23cd4aaa-de91-43da-94f6-0993ce8faa71)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5262] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5263] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5270] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5275] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (f7bb927f-f7cb-408b-8196-821a196da601)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5277] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5278] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5284] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5289] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (665a6b6d-0639-4f80-a528-1737475327c8)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5291] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5292] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5298] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5303] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (573d2f84-70d6-4ad9-a7db-ce1d26aec78c)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5303] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5306] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5309] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5315] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5316] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5319] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5325] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f02707d9-e4fe-4669-9cbc-5a2cf427f827)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5326] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5330] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5333] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5334] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5335] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5346] device (eth1): disconnecting for new activation request.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5347] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5348] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5350] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5351] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5353] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5353] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5355] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5358] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (e5b0b47e-a9e1-403f-84e6-27c9d8b1e239)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5358] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5360] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5361] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5362] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5364] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5365] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5367] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5369] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (a1d851b2-e26d-4371-882a-7060a5d3b49d)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5369] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5371] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5373] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5373] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5375] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <warn>  [1772211257.5376] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5377] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5380] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (de47d997-9e8f-4c7b-b230-8142814791aa)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5380] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5382] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5384] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5384] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5385] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5396] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5397] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5399] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5400] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5405] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5408] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5411] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5413] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5414] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5418] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5421] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5423] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5424] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5428] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5431] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 kernel: ovs-system: entered promiscuous mode
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5433] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5434] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5438] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5442] dhcp4 (eth0): canceled DHCP transaction
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5443] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5443] dhcp4 (eth0): state changed no lease
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5444] dhcp4 (eth0): activation: beginning transaction (no timeout)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5454] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Feb 27 16:54:17 compute-0 systemd-udevd[59349]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:54:17 compute-0 kernel: Timeout policy base is empty
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5539] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5544] dhcp4 (eth0): state changed new lease, address=38.129.56.53
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5547] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59346 uid=0 result="fail" reason="Device is not activated"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5549] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Feb 27 16:54:17 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5606] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5631] device (eth1): disconnecting for new activation request.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5632] audit: op="connection-activate" uuid="4c61635a-3137-500c-b3cf-3399f56581ba" name="ci-private-network" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5655] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59346 uid=0 result="success"
Feb 27 16:54:17 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5707] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5810] device (eth1): Activation: starting connection 'ci-private-network' (4c61635a-3137-500c-b3cf-3399f56581ba)
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5816] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 kernel: br-ex: entered promiscuous mode
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5826] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5829] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5837] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5839] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5842] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5844] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5845] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5845] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5846] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5866] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5873] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5876] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5880] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5885] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5889] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5893] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5897] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5900] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5905] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5909] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5914] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5919] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 kernel: vlan22: entered promiscuous mode
Feb 27 16:54:17 compute-0 systemd-udevd[59351]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5987] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5993] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.5998] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6001] device (eth1): Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6012] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 kernel: vlan21: entered promiscuous mode
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6035] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6036] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6039] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6087] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6094] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 kernel: vlan20: entered promiscuous mode
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6118] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Feb 27 16:54:17 compute-0 systemd-udevd[59350]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6129] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6140] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6141] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6146] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6231] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6233] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6236] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6241] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6257] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6292] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6294] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Feb 27 16:54:17 compute-0 NetworkManager[56537]: <info>  [1772211257.6299] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Feb 27 16:54:18 compute-0 NetworkManager[56537]: <info>  [1772211258.7477] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59346 uid=0 result="success"
Feb 27 16:54:18 compute-0 NetworkManager[56537]: <info>  [1772211258.9249] checkpoint[0x55788ec0e950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Feb 27 16:54:18 compute-0 NetworkManager[56537]: <info>  [1772211258.9251] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 sudo[59678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldterwesjdpweyjvshetmgazwnnklscn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211258.6088362-292-212469226728467/AnsiballZ_async_status.py'
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.1015] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.1022] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 sudo[59678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.2476] audit: op="networking-control" arg="global-dns-configuration" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.2509] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.2546] audit: op="networking-control" arg="global-dns-configuration" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.2571] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 python3.9[59681]: ansible-ansible.legacy.async_status Invoked with jid=j386025950692.59340 mode=status _async_dir=/root/.ansible_async
Feb 27 16:54:19 compute-0 sudo[59678]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.4384] checkpoint[0x55788ec0ea20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Feb 27 16:54:19 compute-0 NetworkManager[56537]: <info>  [1772211259.4390] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59346 uid=0 result="success"
Feb 27 16:54:19 compute-0 ansible-async_wrapper.py[59344]: Module complete (59344)
Feb 27 16:54:20 compute-0 ansible-async_wrapper.py[59343]: Done in kid B.
Feb 27 16:54:22 compute-0 sudo[59783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmiiryrinfialfznnrxkyrczblcfyhic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211258.6088362-292-212469226728467/AnsiballZ_async_status.py'
Feb 27 16:54:22 compute-0 sudo[59783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:22 compute-0 python3.9[59786]: ansible-ansible.legacy.async_status Invoked with jid=j386025950692.59340 mode=status _async_dir=/root/.ansible_async
Feb 27 16:54:22 compute-0 sudo[59783]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:23 compute-0 sudo[59884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfkkotbgaebhulqeekahylcgjwbkdfhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211258.6088362-292-212469226728467/AnsiballZ_async_status.py'
Feb 27 16:54:23 compute-0 sudo[59884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:23 compute-0 python3.9[59887]: ansible-ansible.legacy.async_status Invoked with jid=j386025950692.59340 mode=cleanup _async_dir=/root/.ansible_async
Feb 27 16:54:23 compute-0 sudo[59884]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:23 compute-0 sudo[60037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lovocnahlufjeucfawmbhuenwlkpinit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211263.6103108-319-159629502986376/AnsiballZ_stat.py'
Feb 27 16:54:23 compute-0 sudo[60037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:24 compute-0 python3.9[60040]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:54:24 compute-0 sudo[60037]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:24 compute-0 sudo[60161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohwsijnkkndxqspqhswwolopnbgvukhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211263.6103108-319-159629502986376/AnsiballZ_copy.py'
Feb 27 16:54:24 compute-0 sudo[60161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:24 compute-0 python3.9[60164]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211263.6103108-319-159629502986376/.source.returncode _original_basename=.i9_r07ay follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:24 compute-0 sudo[60161]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:25 compute-0 sudo[60314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dooopaxxshbycofomcuvtrpqpdmyzdbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211264.948071-335-6353131658426/AnsiballZ_stat.py'
Feb 27 16:54:25 compute-0 sudo[60314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:25 compute-0 python3.9[60317]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:54:25 compute-0 sudo[60314]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:25 compute-0 sudo[60438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddimllejytblnthhzfgbjkcbkfgadlqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211264.948071-335-6353131658426/AnsiballZ_copy.py'
Feb 27 16:54:25 compute-0 sudo[60438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:26 compute-0 python3.9[60442]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211264.948071-335-6353131658426/.source.cfg _original_basename=.wjnxn_24 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:26 compute-0 sudo[60438]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 27 16:54:26 compute-0 sudo[60594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcgejhxlwdcwvnmytzzqxntpppjehqnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211266.3497074-350-131549741147714/AnsiballZ_systemd.py'
Feb 27 16:54:26 compute-0 sudo[60594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:26 compute-0 python3.9[60597]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:54:27 compute-0 systemd[1]: Reloading Network Manager...
Feb 27 16:54:27 compute-0 NetworkManager[56537]: <info>  [1772211267.0768] audit: op="reload" arg="0" pid=60601 uid=0 result="success"
Feb 27 16:54:27 compute-0 NetworkManager[56537]: <info>  [1772211267.0777] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Feb 27 16:54:27 compute-0 systemd[1]: Reloaded Network Manager.
Feb 27 16:54:27 compute-0 sudo[60594]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:27 compute-0 sshd-session[52485]: Connection closed by 192.168.122.30 port 44916
Feb 27 16:54:27 compute-0 sshd-session[52482]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:54:27 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Feb 27 16:54:27 compute-0 systemd[1]: session-11.scope: Consumed 46.939s CPU time.
Feb 27 16:54:27 compute-0 systemd-logind[803]: Session 11 logged out. Waiting for processes to exit.
Feb 27 16:54:27 compute-0 systemd-logind[803]: Removed session 11.
Feb 27 16:54:32 compute-0 sshd-session[60632]: Accepted publickey for zuul from 192.168.122.30 port 44662 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:54:32 compute-0 systemd-logind[803]: New session 12 of user zuul.
Feb 27 16:54:32 compute-0 systemd[1]: Started Session 12 of User zuul.
Feb 27 16:54:32 compute-0 sshd-session[60632]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:54:33 compute-0 python3.9[60785]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:54:34 compute-0 python3.9[60939]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:54:35 compute-0 python3.9[61129]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:54:36 compute-0 sshd-session[60635]: Connection closed by 192.168.122.30 port 44662
Feb 27 16:54:36 compute-0 sshd-session[60632]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:54:36 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Feb 27 16:54:36 compute-0 systemd[1]: session-12.scope: Consumed 2.144s CPU time.
Feb 27 16:54:36 compute-0 systemd-logind[803]: Session 12 logged out. Waiting for processes to exit.
Feb 27 16:54:36 compute-0 systemd-logind[803]: Removed session 12.
Feb 27 16:54:37 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 27 16:54:42 compute-0 sshd-session[61158]: Accepted publickey for zuul from 192.168.122.30 port 41704 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:54:42 compute-0 systemd-logind[803]: New session 13 of user zuul.
Feb 27 16:54:42 compute-0 systemd[1]: Started Session 13 of User zuul.
Feb 27 16:54:42 compute-0 sshd-session[61158]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:54:43 compute-0 python3.9[61311]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:54:44 compute-0 python3.9[61465]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:54:45 compute-0 sudo[61620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aurumtrvtzvvpuyasmspamslugzefuyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211285.2214422-35-264700303027029/AnsiballZ_setup.py'
Feb 27 16:54:45 compute-0 sudo[61620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:45 compute-0 python3.9[61623]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:54:46 compute-0 sudo[61620]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:46 compute-0 sudo[61705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcxsbrhsaxynvdncumalefbshhlkmynu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211285.2214422-35-264700303027029/AnsiballZ_dnf.py'
Feb 27 16:54:46 compute-0 sudo[61705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:46 compute-0 python3.9[61708]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:54:47 compute-0 sudo[61705]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:48 compute-0 sudo[61860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fngicgsfbskaxihqxgbbntkxhguvzddz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211288.1589205-47-252216094032451/AnsiballZ_setup.py'
Feb 27 16:54:48 compute-0 sudo[61860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:48 compute-0 python3.9[61863]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:54:49 compute-0 sudo[61860]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:49 compute-0 sudo[62052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcfcoxjqagrascnmumaogelggfpofist ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211289.350313-58-196509626059667/AnsiballZ_file.py'
Feb 27 16:54:49 compute-0 sudo[62052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:49 compute-0 python3.9[62055]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:49 compute-0 sudo[62052]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:50 compute-0 sudo[62205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mweywcraxipzuilvngtqxoycqnfkhdtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211290.156378-66-204240592790030/AnsiballZ_command.py'
Feb 27 16:54:50 compute-0 sudo[62205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:50 compute-0 python3.9[62208]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:54:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:54:50 compute-0 sudo[62205]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:51 compute-0 sudo[62369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouyzzduvqbdvyybvsiryqvwwehambttq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211291.0304697-74-15146400307969/AnsiballZ_stat.py'
Feb 27 16:54:51 compute-0 sudo[62369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:51 compute-0 python3.9[62372]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:54:51 compute-0 sudo[62369]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:51 compute-0 sudo[62448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgznzzclgtzvkelzhoeernoxmfzzjixd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211291.0304697-74-15146400307969/AnsiballZ_file.py'
Feb 27 16:54:51 compute-0 sudo[62448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:52 compute-0 python3.9[62451]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:54:52 compute-0 sudo[62448]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:52 compute-0 sudo[62601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjydxdiaatyvkolrzhoguyhvizwlsrdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211292.246299-86-50042215521393/AnsiballZ_stat.py'
Feb 27 16:54:52 compute-0 sudo[62601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:52 compute-0 python3.9[62604]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:54:52 compute-0 sudo[62601]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:53 compute-0 sudo[62680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptjlhwalnykzieegvlnzxtqkrsckbqoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211292.246299-86-50042215521393/AnsiballZ_file.py'
Feb 27 16:54:53 compute-0 sudo[62680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:53 compute-0 python3.9[62683]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:54:53 compute-0 sudo[62680]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:53 compute-0 sudo[62833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnztanyrirvlsvhcrlxupownsxfewdaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211293.5695162-99-251005814136777/AnsiballZ_ini_file.py'
Feb 27 16:54:53 compute-0 sudo[62833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:54 compute-0 python3.9[62836]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:54:54 compute-0 sudo[62833]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:54 compute-0 sudo[62986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avrqmgokuvoespezzzvfyhokqzkbnhgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211294.3325684-99-238958452980003/AnsiballZ_ini_file.py'
Feb 27 16:54:54 compute-0 sudo[62986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:54 compute-0 python3.9[62989]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:54:54 compute-0 sudo[62986]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:55 compute-0 sudo[63139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-easdldeiechykwaaiytiyfgvbrkqfwgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211294.9777093-99-7828982430048/AnsiballZ_ini_file.py'
Feb 27 16:54:55 compute-0 sudo[63139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:55 compute-0 python3.9[63142]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:54:55 compute-0 sudo[63139]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:55 compute-0 sudo[63292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijqlssgmmndiurvwwscuidpalsihnkxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211295.6436906-99-220445678141583/AnsiballZ_ini_file.py'
Feb 27 16:54:55 compute-0 sudo[63292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:56 compute-0 python3.9[63295]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:54:56 compute-0 sudo[63292]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:56 compute-0 sudo[63445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmjiwxmyucyyfejmmlgguiiyjwfzhkxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211296.3532054-130-194093057467286/AnsiballZ_dnf.py'
Feb 27 16:54:56 compute-0 sudo[63445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:56 compute-0 python3.9[63448]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:54:57 compute-0 sudo[63445]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:58 compute-0 sudo[63599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvfnuhevvgfcbpsxawrrcoxutktucxnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211298.483413-141-235161454614781/AnsiballZ_setup.py'
Feb 27 16:54:58 compute-0 sudo[63599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:59 compute-0 python3.9[63602]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:54:59 compute-0 sudo[63599]: pam_unix(sudo:session): session closed for user root
Feb 27 16:54:59 compute-0 sudo[63754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osnmptpuxjbxgmvweagrejggihwsvjno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211299.3298678-149-222106615096512/AnsiballZ_stat.py'
Feb 27 16:54:59 compute-0 sudo[63754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:54:59 compute-0 python3.9[63757]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:54:59 compute-0 sudo[63754]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:00 compute-0 sudo[63907]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gbricevtrrgfwgnhxtzweeishfyjavhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211300.022748-158-101862401710628/AnsiballZ_stat.py'
Feb 27 16:55:00 compute-0 sudo[63907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:00 compute-0 python3.9[63910]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:55:00 compute-0 sudo[63907]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:01 compute-0 sudo[64060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifufrlajfiifqiiakghixwuzhdivkhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211300.895322-168-175194021747951/AnsiballZ_command.py'
Feb 27 16:55:01 compute-0 sudo[64060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:01 compute-0 python3.9[64063]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:55:01 compute-0 sudo[64060]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:02 compute-0 sudo[64214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmetmjmjoteekuatbomswymrxrvhelmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211301.7083426-178-57379142779231/AnsiballZ_service_facts.py'
Feb 27 16:55:02 compute-0 sudo[64214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:02 compute-0 python3.9[64217]: ansible-service_facts Invoked
Feb 27 16:55:02 compute-0 network[64234]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 16:55:02 compute-0 network[64235]: 'network-scripts' will be removed from distribution in near future.
Feb 27 16:55:02 compute-0 network[64236]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 16:55:04 compute-0 sudo[64214]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:06 compute-0 sudo[64520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyvngxvphwoxfyqxbjkrslprtaxccecu ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1772211305.6443586-193-199539595298911/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1772211305.6443586-193-199539595298911/args'
Feb 27 16:55:06 compute-0 sudo[64520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:06 compute-0 sudo[64520]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:06 compute-0 sudo[64688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpjxznrpajrwxlwuklsbfcadhjtibqnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211306.4006398-204-136052208943948/AnsiballZ_dnf.py'
Feb 27 16:55:06 compute-0 sudo[64688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:06 compute-0 python3.9[64691]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:55:08 compute-0 sudo[64688]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:09 compute-0 sudo[64842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxrpogvdwhodzhyuchmkjpovbvnlgcjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211308.5089939-217-231446114421430/AnsiballZ_package_facts.py'
Feb 27 16:55:09 compute-0 sudo[64842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:09 compute-0 python3.9[64845]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 27 16:55:09 compute-0 sudo[64842]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:10 compute-0 sudo[64995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujlrmumuxvhwaniebznfnwnetdkenhqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211310.2014506-227-110607289027509/AnsiballZ_stat.py'
Feb 27 16:55:10 compute-0 sudo[64995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:10 compute-0 python3.9[64998]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:10 compute-0 sudo[64995]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:11 compute-0 sudo[65121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orssnkuysrvrerhkwfhonqyzzjuvqfgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211310.2014506-227-110607289027509/AnsiballZ_copy.py'
Feb 27 16:55:11 compute-0 sudo[65121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:11 compute-0 python3.9[65124]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211310.2014506-227-110607289027509/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:11 compute-0 sudo[65121]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:12 compute-0 sudo[65276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wygytytmsbshqvnowayztltqjvumgyua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211311.9690623-242-204168994651526/AnsiballZ_stat.py'
Feb 27 16:55:12 compute-0 sudo[65276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:12 compute-0 python3.9[65279]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:12 compute-0 sudo[65276]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:12 compute-0 sudo[65402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kytaratkcpfqqzvekwzgmokegqqottel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211311.9690623-242-204168994651526/AnsiballZ_copy.py'
Feb 27 16:55:12 compute-0 sudo[65402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:13 compute-0 python3.9[65405]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211311.9690623-242-204168994651526/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:13 compute-0 sudo[65402]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:14 compute-0 sudo[65557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnvrziinlehkiesxzusyubtcwlxjsdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211313.6660652-263-30371627765492/AnsiballZ_lineinfile.py'
Feb 27 16:55:14 compute-0 sudo[65557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:14 compute-0 python3.9[65560]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:14 compute-0 sudo[65557]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:15 compute-0 sudo[65712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgsjvxwtvngbfsitekzypgokbhgyuaxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211315.1528103-278-278266779048927/AnsiballZ_setup.py'
Feb 27 16:55:15 compute-0 sudo[65712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:15 compute-0 python3.9[65715]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:55:16 compute-0 sudo[65712]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:16 compute-0 sudo[65797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcadglodcghxgfpwubnzantrijnfgsfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211315.1528103-278-278266779048927/AnsiballZ_systemd.py'
Feb 27 16:55:16 compute-0 sudo[65797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:17 compute-0 python3.9[65800]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:17 compute-0 sudo[65797]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:17 compute-0 sudo[65952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvaeiciciwnonadrsbcdzeqifqegljtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211317.6815653-294-245406860135972/AnsiballZ_setup.py'
Feb 27 16:55:17 compute-0 sudo[65952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:18 compute-0 python3.9[65955]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:55:18 compute-0 sudo[65952]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:18 compute-0 sudo[66037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzcglotfmketsvklfybyarnbcpkqsidx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211317.6815653-294-245406860135972/AnsiballZ_systemd.py'
Feb 27 16:55:18 compute-0 sudo[66037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:19 compute-0 python3.9[66040]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:55:19 compute-0 chronyd[817]: chronyd exiting
Feb 27 16:55:19 compute-0 systemd[1]: Stopping NTP client/server...
Feb 27 16:55:19 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Feb 27 16:55:19 compute-0 systemd[1]: Stopped NTP client/server.
Feb 27 16:55:19 compute-0 systemd[1]: Starting NTP client/server...
Feb 27 16:55:19 compute-0 chronyd[66048]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Feb 27 16:55:19 compute-0 chronyd[66048]: Frequency -24.748 +/- 0.250 ppm read from /var/lib/chrony/drift
Feb 27 16:55:19 compute-0 chronyd[66048]: Loaded seccomp filter (level 2)
Feb 27 16:55:19 compute-0 systemd[1]: Started NTP client/server.
Feb 27 16:55:19 compute-0 sudo[66037]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:19 compute-0 sshd-session[61161]: Connection closed by 192.168.122.30 port 41704
Feb 27 16:55:19 compute-0 sshd-session[61158]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:55:19 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Feb 27 16:55:19 compute-0 systemd[1]: session-13.scope: Consumed 24.475s CPU time.
Feb 27 16:55:19 compute-0 systemd-logind[803]: Session 13 logged out. Waiting for processes to exit.
Feb 27 16:55:19 compute-0 systemd-logind[803]: Removed session 13.
Feb 27 16:55:25 compute-0 sshd-session[66074]: Accepted publickey for zuul from 192.168.122.30 port 40606 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:55:25 compute-0 systemd-logind[803]: New session 14 of user zuul.
Feb 27 16:55:25 compute-0 systemd[1]: Started Session 14 of User zuul.
Feb 27 16:55:25 compute-0 sshd-session[66074]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:55:26 compute-0 python3.9[66227]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:55:27 compute-0 sudo[66381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzgmodpsxxrfslqqozlvymvqdywgrfew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211326.9724426-28-261237957161827/AnsiballZ_file.py'
Feb 27 16:55:27 compute-0 sudo[66381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:27 compute-0 python3.9[66384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:27 compute-0 sudo[66381]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:28 compute-0 sudo[66557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zncgwzelaotlugqvoshlbreimdkekrnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211327.8317673-36-258965295537693/AnsiballZ_stat.py'
Feb 27 16:55:28 compute-0 sudo[66557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:28 compute-0 python3.9[66560]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:28 compute-0 sudo[66557]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:28 compute-0 sudo[66636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adpbqjvljhwhsobmtukexqjraxxrhzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211327.8317673-36-258965295537693/AnsiballZ_file.py'
Feb 27 16:55:28 compute-0 sudo[66636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:28 compute-0 python3.9[66639]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.j57v6n8t recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:28 compute-0 sudo[66636]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:29 compute-0 sudo[66789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzlwgthknayyyeuiyncmlobatrwtubkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211329.3843174-56-53272339434344/AnsiballZ_stat.py'
Feb 27 16:55:29 compute-0 sudo[66789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:30 compute-0 python3.9[66792]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:30 compute-0 sudo[66789]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:30 compute-0 sudo[66913]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsaxdjnbflqqrdmxbjqaikzookzrvexu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211329.3843174-56-53272339434344/AnsiballZ_copy.py'
Feb 27 16:55:30 compute-0 sudo[66913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:30 compute-0 python3.9[66916]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211329.3843174-56-53272339434344/.source _original_basename=.wlfil80z follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:30 compute-0 sudo[66913]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:31 compute-0 sudo[67066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwmineichvhefxdyufnbcgikjikhbxzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211331.1223059-72-259932894778189/AnsiballZ_file.py'
Feb 27 16:55:31 compute-0 sudo[67066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:31 compute-0 python3.9[67069]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:55:31 compute-0 sudo[67066]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:32 compute-0 sudo[67219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivygoxgjwlunhdlnfmyxvtqhuypwmizf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211331.835147-80-68091596465355/AnsiballZ_stat.py'
Feb 27 16:55:32 compute-0 sudo[67219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:32 compute-0 python3.9[67222]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:32 compute-0 sudo[67219]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:32 compute-0 sudo[67343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccswvhezcvcyejrlethosfgueslvuwkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211331.835147-80-68091596465355/AnsiballZ_copy.py'
Feb 27 16:55:32 compute-0 sudo[67343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:32 compute-0 python3.9[67346]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211331.835147-80-68091596465355/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:55:32 compute-0 sudo[67343]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:33 compute-0 sudo[67496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfpdzdkplogoluyuvocnpdmtowttrynr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211333.1327214-80-170299607997490/AnsiballZ_stat.py'
Feb 27 16:55:33 compute-0 sudo[67496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:33 compute-0 python3.9[67499]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:33 compute-0 sudo[67496]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:34 compute-0 sudo[67620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhckhsoaajwaejbqtvbhlhloehsutdwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211333.1327214-80-170299607997490/AnsiballZ_copy.py'
Feb 27 16:55:34 compute-0 sudo[67620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:34 compute-0 python3.9[67623]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211333.1327214-80-170299607997490/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:55:34 compute-0 sudo[67620]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:34 compute-0 sudo[67773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-javibczaogmbewmlmozrshgapwhwjyes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211334.3730695-109-30390085306053/AnsiballZ_file.py'
Feb 27 16:55:34 compute-0 sudo[67773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:34 compute-0 python3.9[67776]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:34 compute-0 sudo[67773]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:35 compute-0 sudo[67926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggabncqrckndqeojqtsgzfxjxbujcabq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211335.0641172-117-201358928493005/AnsiballZ_stat.py'
Feb 27 16:55:35 compute-0 sudo[67926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:35 compute-0 python3.9[67929]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:35 compute-0 sudo[67926]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:35 compute-0 sudo[68050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvtqoqkiaamrrxulqhgtekeohlzxynbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211335.0641172-117-201358928493005/AnsiballZ_copy.py'
Feb 27 16:55:35 compute-0 sudo[68050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:36 compute-0 python3.9[68053]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211335.0641172-117-201358928493005/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:36 compute-0 sudo[68050]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:36 compute-0 sudo[68203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbmfbqhrbwlneqizdspszexobjrtjtqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211336.325068-132-94654438542677/AnsiballZ_stat.py'
Feb 27 16:55:36 compute-0 sudo[68203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:36 compute-0 python3.9[68206]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:36 compute-0 sudo[68203]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:37 compute-0 sudo[68327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgcfkporvjrkvjwgpyokermezyovpuku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211336.325068-132-94654438542677/AnsiballZ_copy.py'
Feb 27 16:55:37 compute-0 sudo[68327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:37 compute-0 python3.9[68330]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211336.325068-132-94654438542677/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:37 compute-0 sudo[68327]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:38 compute-0 sudo[68480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhwiunxesrwtxygtrbrsnvfubianbqkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211337.6129413-147-20150710327443/AnsiballZ_systemd.py'
Feb 27 16:55:38 compute-0 sudo[68480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:38 compute-0 python3.9[68483]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:38 compute-0 systemd[1]: Reloading.
Feb 27 16:55:38 compute-0 systemd-sysv-generator[68512]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:38 compute-0 systemd-rc-local-generator[68506]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:38 compute-0 systemd[1]: Reloading.
Feb 27 16:55:38 compute-0 systemd-rc-local-generator[68553]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:38 compute-0 systemd-sysv-generator[68559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:38 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Feb 27 16:55:38 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Feb 27 16:55:38 compute-0 sudo[68480]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:39 compute-0 sudo[68723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsoltsbumclnzfdzzmbajigrpcdnknpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211339.12882-155-266576081994965/AnsiballZ_stat.py'
Feb 27 16:55:39 compute-0 sudo[68723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:39 compute-0 python3.9[68726]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:39 compute-0 sudo[68723]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:39 compute-0 sudo[68847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obaobupmxkmvltuixyvkulikswuajcop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211339.12882-155-266576081994965/AnsiballZ_copy.py'
Feb 27 16:55:39 compute-0 sudo[68847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:40 compute-0 python3.9[68850]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211339.12882-155-266576081994965/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:40 compute-0 sudo[68847]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:40 compute-0 sudo[69000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctueydvcavoapdldmmfdfnrvurxeeqwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211340.3514423-170-12727145170809/AnsiballZ_stat.py'
Feb 27 16:55:40 compute-0 sudo[69000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:40 compute-0 python3.9[69003]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:40 compute-0 sudo[69000]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:41 compute-0 sudo[69124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkivyopiwadokdrxgvnkeegormhwqrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211340.3514423-170-12727145170809/AnsiballZ_copy.py'
Feb 27 16:55:41 compute-0 sudo[69124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:41 compute-0 python3.9[69127]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211340.3514423-170-12727145170809/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:41 compute-0 sudo[69124]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:41 compute-0 sudo[69277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpfeslhcutznzuqwviesdhpuilkjabhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211341.615238-185-253552429365133/AnsiballZ_systemd.py'
Feb 27 16:55:41 compute-0 sudo[69277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:42 compute-0 python3.9[69280]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:42 compute-0 systemd[1]: Reloading.
Feb 27 16:55:42 compute-0 systemd-rc-local-generator[69305]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:42 compute-0 systemd-sysv-generator[69310]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:42 compute-0 systemd[1]: Reloading.
Feb 27 16:55:42 compute-0 systemd-rc-local-generator[69353]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:42 compute-0 systemd-sysv-generator[69358]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:42 compute-0 systemd[1]: Starting Create netns directory...
Feb 27 16:55:42 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 27 16:55:42 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 27 16:55:42 compute-0 systemd[1]: Finished Create netns directory.
Feb 27 16:55:42 compute-0 sudo[69277]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:43 compute-0 python3.9[69520]: ansible-ansible.builtin.service_facts Invoked
Feb 27 16:55:43 compute-0 network[69537]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 16:55:43 compute-0 network[69538]: 'network-scripts' will be removed from distribution in near future.
Feb 27 16:55:43 compute-0 network[69539]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 16:55:46 compute-0 sudo[69800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwfjghkqhonofczipedzfssyimmidokt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211346.2077413-201-40831577757320/AnsiballZ_systemd.py'
Feb 27 16:55:46 compute-0 sudo[69800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:46 compute-0 python3.9[69803]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:47 compute-0 systemd[1]: Reloading.
Feb 27 16:55:47 compute-0 systemd-sysv-generator[69831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:47 compute-0 systemd-rc-local-generator[69825]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:47 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Feb 27 16:55:47 compute-0 iptables.init[69850]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Feb 27 16:55:47 compute-0 iptables.init[69850]: iptables: Flushing firewall rules: [  OK  ]
Feb 27 16:55:47 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Feb 27 16:55:47 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Feb 27 16:55:47 compute-0 sudo[69800]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:48 compute-0 sudo[70044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilpcqvygxqtjxgjaxjfdplbecodjkteq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211347.7749023-201-272797114172869/AnsiballZ_systemd.py'
Feb 27 16:55:48 compute-0 sudo[70044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:48 compute-0 python3.9[70047]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:48 compute-0 sudo[70044]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:48 compute-0 sudo[70199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxzbqacmcmtnjnqektbqgsszmhworfin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211348.625367-217-150854688521422/AnsiballZ_systemd.py'
Feb 27 16:55:48 compute-0 sudo[70199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:49 compute-0 python3.9[70202]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:55:49 compute-0 systemd[1]: Reloading.
Feb 27 16:55:49 compute-0 systemd-rc-local-generator[70228]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:55:49 compute-0 systemd-sysv-generator[70231]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:55:49 compute-0 systemd[1]: Starting Netfilter Tables...
Feb 27 16:55:49 compute-0 systemd[1]: Finished Netfilter Tables.
Feb 27 16:55:49 compute-0 sudo[70199]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:50 compute-0 sudo[70399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nasapkncovsfscmdvnavuduyqmipkgmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211349.7652519-225-158536827632658/AnsiballZ_command.py'
Feb 27 16:55:50 compute-0 sudo[70399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:50 compute-0 python3.9[70402]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:55:50 compute-0 sudo[70399]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:51 compute-0 sudo[70553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sndvizdqfiorofcdzyhzqhyqlyeqntdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211350.8162794-239-126368093364256/AnsiballZ_stat.py'
Feb 27 16:55:51 compute-0 sudo[70553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:51 compute-0 python3.9[70556]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:51 compute-0 sudo[70553]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:51 compute-0 sudo[70679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyggcqnlmutyoqemjcvfwvaitgwebqlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211350.8162794-239-126368093364256/AnsiballZ_copy.py'
Feb 27 16:55:51 compute-0 sudo[70679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:51 compute-0 python3.9[70682]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211350.8162794-239-126368093364256/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:52 compute-0 sudo[70679]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:52 compute-0 sudo[70833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffuybszvxhthbmbsbquxiyeiwjdkkkwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211352.1943724-254-121338900471262/AnsiballZ_systemd.py'
Feb 27 16:55:52 compute-0 sudo[70833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:52 compute-0 python3.9[70836]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:55:52 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Feb 27 16:55:52 compute-0 sshd[1013]: Received SIGHUP; restarting.
Feb 27 16:55:52 compute-0 sshd[1013]: Server listening on 0.0.0.0 port 22.
Feb 27 16:55:52 compute-0 sshd[1013]: Server listening on :: port 22.
Feb 27 16:55:52 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Feb 27 16:55:52 compute-0 sudo[70833]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:53 compute-0 sudo[70990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-temzcssxixirsdbkcgokldrllfscoglw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211353.0248601-262-127552327990715/AnsiballZ_file.py'
Feb 27 16:55:53 compute-0 sudo[70990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:53 compute-0 python3.9[70993]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:53 compute-0 sudo[70990]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:53 compute-0 sudo[71143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uecgypgdrblzdajjmlcuesmysvminrel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211353.6587694-270-194878843814173/AnsiballZ_stat.py'
Feb 27 16:55:53 compute-0 sudo[71143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:54 compute-0 python3.9[71146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:54 compute-0 sudo[71143]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:54 compute-0 sudo[71267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwztpuvuhqmgrymomjetowdvzguektbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211353.6587694-270-194878843814173/AnsiballZ_copy.py'
Feb 27 16:55:54 compute-0 sudo[71267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:54 compute-0 python3.9[71270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211353.6587694-270-194878843814173/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:54 compute-0 sudo[71267]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:55 compute-0 sudo[71420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fimsctsamqmgvlhbsslwzfnukcevkpif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211354.9678068-288-105479291339911/AnsiballZ_timezone.py'
Feb 27 16:55:55 compute-0 sudo[71420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:55 compute-0 python3.9[71423]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 27 16:55:55 compute-0 systemd[1]: Starting Time & Date Service...
Feb 27 16:55:55 compute-0 systemd[1]: Started Time & Date Service.
Feb 27 16:55:55 compute-0 sudo[71420]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:56 compute-0 sudo[71577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjbbqnlnafouclvtbslnkcfjocgcvpud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211356.024406-297-64320228855919/AnsiballZ_file.py'
Feb 27 16:55:56 compute-0 sudo[71577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:56 compute-0 python3.9[71580]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:56 compute-0 sudo[71577]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:57 compute-0 sudo[71730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsnlovihhbldpfvvfmezdaxfkkanfvaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211356.7157483-305-219591904761412/AnsiballZ_stat.py'
Feb 27 16:55:57 compute-0 sudo[71730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:57 compute-0 python3.9[71733]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:57 compute-0 sudo[71730]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:57 compute-0 sudo[71854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxazydbdrejecsydhkkozavxumopujlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211356.7157483-305-219591904761412/AnsiballZ_copy.py'
Feb 27 16:55:57 compute-0 sudo[71854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:57 compute-0 python3.9[71857]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211356.7157483-305-219591904761412/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:57 compute-0 sudo[71854]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:58 compute-0 sudo[72007]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnauztsbamjfgnulvivqvaigvzniafpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211358.000896-320-70330481461142/AnsiballZ_stat.py'
Feb 27 16:55:58 compute-0 sudo[72007]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:58 compute-0 python3.9[72010]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:58 compute-0 sudo[72007]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:58 compute-0 sudo[72131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syzpwmeyfwwmdtabnwuryxohretcrxhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211358.000896-320-70330481461142/AnsiballZ_copy.py'
Feb 27 16:55:58 compute-0 sudo[72131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:59 compute-0 python3.9[72134]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211358.000896-320-70330481461142/.source.yaml _original_basename=._a1npgdj follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:55:59 compute-0 sudo[72131]: pam_unix(sudo:session): session closed for user root
Feb 27 16:55:59 compute-0 sudo[72284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kepbwtzbpnarnifxwtedceekvhztlimu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211359.269733-335-253695960832395/AnsiballZ_stat.py'
Feb 27 16:55:59 compute-0 sudo[72284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:55:59 compute-0 python3.9[72287]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:55:59 compute-0 sudo[72284]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:00 compute-0 sudo[72408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xordszhtyfltulfcxtzlbnucfqmxjzuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211359.269733-335-253695960832395/AnsiballZ_copy.py'
Feb 27 16:56:00 compute-0 sudo[72408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:00 compute-0 python3.9[72411]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211359.269733-335-253695960832395/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:00 compute-0 sudo[72408]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:00 compute-0 sudo[72561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xixlksctultmriavhrympskaffotnwgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211360.5661757-350-219328430060889/AnsiballZ_command.py'
Feb 27 16:56:00 compute-0 sudo[72561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:00 compute-0 python3.9[72564]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:01 compute-0 sudo[72561]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:01 compute-0 sudo[72715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nchvggubqyjefinrbzyyrxpnznkhzoqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211361.1786964-358-18778775923972/AnsiballZ_command.py'
Feb 27 16:56:01 compute-0 sudo[72715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:01 compute-0 python3.9[72718]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:01 compute-0 sudo[72715]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:02 compute-0 sudo[72869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdmpyfchgogwqaalaxszxjsffqjovowa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211361.8947613-366-266207059611538/AnsiballZ_edpm_nftables_from_files.py'
Feb 27 16:56:02 compute-0 sudo[72869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:02 compute-0 python3[72872]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 27 16:56:02 compute-0 sudo[72869]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:03 compute-0 sudo[73022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwqnugkxehgoucqoulmyihbsozlsprsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211362.7341177-374-199239789401132/AnsiballZ_stat.py'
Feb 27 16:56:03 compute-0 sudo[73022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:03 compute-0 python3.9[73025]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:56:03 compute-0 sudo[73022]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:03 compute-0 sudo[73146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxiyycgrxokyiqqrlgpbwargynafkird ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211362.7341177-374-199239789401132/AnsiballZ_copy.py'
Feb 27 16:56:03 compute-0 sudo[73146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:03 compute-0 python3.9[73149]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211362.7341177-374-199239789401132/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:03 compute-0 sudo[73146]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:04 compute-0 sudo[73299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeugzpvmaaljuwxdabrqjiuoqyacggkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211364.0166285-389-83224197581192/AnsiballZ_stat.py'
Feb 27 16:56:04 compute-0 sudo[73299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:04 compute-0 python3.9[73302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:56:04 compute-0 sudo[73299]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:04 compute-0 sudo[73423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlztrwkocjdfrojvfcpqbhodqmcgtxpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211364.0166285-389-83224197581192/AnsiballZ_copy.py'
Feb 27 16:56:04 compute-0 sudo[73423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:05 compute-0 python3.9[73426]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211364.0166285-389-83224197581192/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:05 compute-0 sudo[73423]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:05 compute-0 sudo[73576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rogfdrekbqlojgjltvxuqrrivytkkoqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211365.3031633-404-47953595300567/AnsiballZ_stat.py'
Feb 27 16:56:05 compute-0 sudo[73576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:05 compute-0 python3.9[73579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:56:05 compute-0 sudo[73576]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:06 compute-0 sudo[73700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dklzxdorawekzykjyejcllddrjirmulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211365.3031633-404-47953595300567/AnsiballZ_copy.py'
Feb 27 16:56:06 compute-0 sudo[73700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:06 compute-0 python3.9[73703]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211365.3031633-404-47953595300567/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:06 compute-0 sudo[73700]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:06 compute-0 sudo[73853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczyqtlyormsmwahlzyqmtbaontziext ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211366.4584928-419-10228845161882/AnsiballZ_stat.py'
Feb 27 16:56:06 compute-0 sudo[73853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:06 compute-0 python3.9[73856]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:56:06 compute-0 sudo[73853]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:07 compute-0 sudo[73977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gendxbjtfwqwjxcpfipzxzxurrxciqmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211366.4584928-419-10228845161882/AnsiballZ_copy.py'
Feb 27 16:56:07 compute-0 sudo[73977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:07 compute-0 python3.9[73980]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211366.4584928-419-10228845161882/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:07 compute-0 sudo[73977]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:08 compute-0 sudo[74130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tougydrkmquatiggpdjocgnxigbvubca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211367.7345676-434-67856491326101/AnsiballZ_stat.py'
Feb 27 16:56:08 compute-0 sudo[74130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:08 compute-0 python3.9[74133]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:56:08 compute-0 sudo[74130]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:08 compute-0 sudo[74254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfxegzysjjunhcntnfqktkpfruwxlxxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211367.7345676-434-67856491326101/AnsiballZ_copy.py'
Feb 27 16:56:08 compute-0 sudo[74254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:08 compute-0 python3.9[74257]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211367.7345676-434-67856491326101/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:08 compute-0 sudo[74254]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:09 compute-0 sudo[74407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzsmfvafphxqqwlnzjcggkalblhnwjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211368.995172-449-55960330220590/AnsiballZ_file.py'
Feb 27 16:56:09 compute-0 sudo[74407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:09 compute-0 python3.9[74410]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:09 compute-0 sudo[74407]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:09 compute-0 sudo[74560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nihwdfpvldahysvbxpxzpfczuirrglli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211369.6570063-457-151068176454972/AnsiballZ_command.py'
Feb 27 16:56:09 compute-0 sudo[74560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:10 compute-0 python3.9[74563]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:10 compute-0 sudo[74560]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:11 compute-0 sudo[74720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exdbhuwsatnxgwluzdfrbafrwtrozfqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211370.4533627-465-248332924208556/AnsiballZ_blockinfile.py'
Feb 27 16:56:11 compute-0 sudo[74720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:11 compute-0 python3.9[74723]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:11 compute-0 sudo[74720]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:11 compute-0 sudo[74874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfrgyqmgtznwtotiipvfysobcshiriqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211371.5211804-474-89153049118143/AnsiballZ_file.py'
Feb 27 16:56:11 compute-0 sudo[74874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:11 compute-0 python3.9[74877]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:12 compute-0 sudo[74874]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:12 compute-0 sudo[75027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukruuphlpxrssrqybpiucnxzpyvbukpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211372.1635103-474-198352518073577/AnsiballZ_file.py'
Feb 27 16:56:12 compute-0 sudo[75027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:12 compute-0 python3.9[75030]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:12 compute-0 sudo[75027]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:13 compute-0 sudo[75180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxftazmqcokuliarplvmvfpprzculqsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211373.0145555-489-203889208684570/AnsiballZ_mount.py'
Feb 27 16:56:13 compute-0 sudo[75180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:13 compute-0 python3.9[75183]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 27 16:56:13 compute-0 sudo[75180]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:13 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 16:56:14 compute-0 sudo[75335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uptvclctarsmrjlnbehfvguqihkzoexr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211373.9128232-489-150854171532985/AnsiballZ_mount.py'
Feb 27 16:56:14 compute-0 sudo[75335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:14 compute-0 python3.9[75338]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 27 16:56:14 compute-0 sudo[75335]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:14 compute-0 sshd-session[66077]: Connection closed by 192.168.122.30 port 40606
Feb 27 16:56:14 compute-0 sshd-session[66074]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:56:14 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Feb 27 16:56:14 compute-0 systemd[1]: session-14.scope: Consumed 33.164s CPU time.
Feb 27 16:56:14 compute-0 systemd-logind[803]: Session 14 logged out. Waiting for processes to exit.
Feb 27 16:56:14 compute-0 systemd-logind[803]: Removed session 14.
Feb 27 16:56:19 compute-0 sshd-session[75364]: Accepted publickey for zuul from 192.168.122.30 port 57058 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:56:19 compute-0 systemd-logind[803]: New session 15 of user zuul.
Feb 27 16:56:19 compute-0 systemd[1]: Started Session 15 of User zuul.
Feb 27 16:56:19 compute-0 sshd-session[75364]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:56:20 compute-0 sudo[75517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cogqmqayoflvwmlgthpyfklvlmdbtglp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211379.88942-16-53809720624687/AnsiballZ_tempfile.py'
Feb 27 16:56:20 compute-0 sudo[75517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:20 compute-0 python3.9[75520]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 27 16:56:20 compute-0 sudo[75517]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:21 compute-0 sudo[75670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeduwoxfefpjujryxoxvpbprueplhsno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211380.7721193-28-109108296300547/AnsiballZ_stat.py'
Feb 27 16:56:21 compute-0 sudo[75670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:21 compute-0 python3.9[75673]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:56:21 compute-0 sudo[75670]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:22 compute-0 sudo[75823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-weprfljyryxbfydsnudkntjxbvzjdjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211381.649394-38-175378199498054/AnsiballZ_setup.py'
Feb 27 16:56:22 compute-0 sudo[75823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:22 compute-0 python3.9[75826]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:56:22 compute-0 sudo[75823]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:23 compute-0 sudo[75976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqagxyrneiwiswlejpfrrfwfagdfahxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211382.7489417-47-53841272623205/AnsiballZ_blockinfile.py'
Feb 27 16:56:23 compute-0 sudo[75976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:23 compute-0 python3.9[75979]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDRbcWZlCUA8a2pUWFuTAGDN9tQLKerHmMmHJQlcNPtwCisTzfHOLnKoU/c6iIO45zyKxHM4RHyTUAoCtRF4apxwWPFx2LonmLqGnbpt5U7cElfg2sQFkMrIQhvee1QYEQnbljuSCPSkF/IHBTVtvbtTtM9/EFfW4gDBNlTVGUpffdhxzD0R0XYY885ygJYKy/m6gxADFSh1CIdgY7bPb+pMOD2zVGWYmjjZo94e1f4uwwBbmEogMZIC7ZFk4vBXgrIwN8K2w0jdXrJRRa2ydw7mZyRFBrYd7Cmil/A+JFR1go4J28vi+EesrjiLyjZl/LCm5s87M7soBN+7cy+zlLbGddFEQgzqrHN+D+Mp5gwXsfdEOjyof+VsiLzvq2JRWxE+kxi0PlbkJ+15rLF6YmuxOdUk8/RdUT29eeNujHVJWnlsJZHcsDvtl/OnrJ5nnnxCHwiw199qRJKbPOdt48Qwexudt1Eu0sHvG95CtoF7j7B8ErgK+FgS4/j6pnXLc0=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINvsDz/Jbmr9b+dgOZ15l906HH8vd5u5/vv5q5uQChEI
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIVRkSGAhI2oejK1AEA0JAPfV6FkyzF+GION8HxNPvKGEinva0H2P13i5xp/i4FDaX4weYN66YorOshccZLx5nw=
                                             create=True mode=0644 path=/tmp/ansible.bg0ovgh3 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:23 compute-0 sudo[75976]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:23 compute-0 sudo[76129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulhqmwcjgwxaiohqwfqrgghncgqogzrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211383.5361369-55-19788362588183/AnsiballZ_command.py'
Feb 27 16:56:23 compute-0 sudo[76129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:24 compute-0 python3.9[76132]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.bg0ovgh3' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:24 compute-0 sudo[76129]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:24 compute-0 sudo[76284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-maruivjliwavyucnqdgfzomxnwiemnom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211384.3564541-63-218796228090955/AnsiballZ_file.py'
Feb 27 16:56:24 compute-0 sudo[76284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:24 compute-0 python3.9[76287]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.bg0ovgh3 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:24 compute-0 sudo[76284]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:25 compute-0 sshd-session[75367]: Connection closed by 192.168.122.30 port 57058
Feb 27 16:56:25 compute-0 sshd-session[75364]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:56:25 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Feb 27 16:56:25 compute-0 systemd[1]: session-15.scope: Consumed 3.087s CPU time.
Feb 27 16:56:25 compute-0 systemd-logind[803]: Session 15 logged out. Waiting for processes to exit.
Feb 27 16:56:25 compute-0 systemd-logind[803]: Removed session 15.
Feb 27 16:56:25 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 27 16:56:30 compute-0 sshd-session[76315]: Accepted publickey for zuul from 192.168.122.30 port 40206 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:56:30 compute-0 systemd-logind[803]: New session 16 of user zuul.
Feb 27 16:56:30 compute-0 systemd[1]: Started Session 16 of User zuul.
Feb 27 16:56:30 compute-0 sshd-session[76315]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:56:31 compute-0 python3.9[76468]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:56:32 compute-0 sudo[76622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmwilpzpooethmmayrzckamkxlwvynkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211392.1832967-27-4262628743339/AnsiballZ_systemd.py'
Feb 27 16:56:32 compute-0 sudo[76622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:33 compute-0 python3.9[76625]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 27 16:56:33 compute-0 sudo[76622]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:33 compute-0 sudo[76777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvmdrkkpdgwqhstnhnxrbaxkozeglbel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211393.3917007-35-269429172144679/AnsiballZ_systemd.py'
Feb 27 16:56:33 compute-0 sudo[76777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:34 compute-0 python3.9[76780]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 16:56:34 compute-0 sudo[76777]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:34 compute-0 sudo[76931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omotrljplsnmeyrxhwjwcieejjdiorsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211394.283747-44-110640070891150/AnsiballZ_command.py'
Feb 27 16:56:34 compute-0 sudo[76931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:35 compute-0 python3.9[76934]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:35 compute-0 sudo[76931]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:35 compute-0 sudo[77085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anyzpblmdgnmdqqdpxrhxdpnnpoqzwdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211395.3019016-52-221596241109514/AnsiballZ_stat.py'
Feb 27 16:56:35 compute-0 sudo[77085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:36 compute-0 python3.9[77088]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:56:36 compute-0 sudo[77085]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:36 compute-0 sudo[77240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzlnjxluqmbisytrcotmyewutnvmonqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211396.248374-60-129506956292809/AnsiballZ_command.py'
Feb 27 16:56:36 compute-0 sudo[77240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:36 compute-0 python3.9[77243]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:36 compute-0 sudo[77240]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:37 compute-0 sudo[77396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nptmthzgdffyjosmuwpyblshpgldnnrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211397.0202281-68-274590227693678/AnsiballZ_file.py'
Feb 27 16:56:37 compute-0 sudo[77396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:37 compute-0 python3.9[77399]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:56:37 compute-0 sudo[77396]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:38 compute-0 sshd-session[76318]: Connection closed by 192.168.122.30 port 40206
Feb 27 16:56:38 compute-0 sshd-session[76315]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:56:38 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Feb 27 16:56:38 compute-0 systemd[1]: session-16.scope: Consumed 4.438s CPU time.
Feb 27 16:56:38 compute-0 systemd-logind[803]: Session 16 logged out. Waiting for processes to exit.
Feb 27 16:56:38 compute-0 systemd-logind[803]: Removed session 16.
Feb 27 16:56:43 compute-0 sshd-session[77424]: Accepted publickey for zuul from 192.168.122.30 port 43266 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:56:43 compute-0 systemd-logind[803]: New session 17 of user zuul.
Feb 27 16:56:43 compute-0 systemd[1]: Started Session 17 of User zuul.
Feb 27 16:56:43 compute-0 sshd-session[77424]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:56:44 compute-0 python3.9[77577]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:56:45 compute-0 sudo[77731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atjjzvssnbmpbsubsrzsvusvgvpkrjhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211404.8373432-29-253285927504284/AnsiballZ_setup.py'
Feb 27 16:56:45 compute-0 sudo[77731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:45 compute-0 python3.9[77734]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:56:45 compute-0 sudo[77731]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:46 compute-0 sudo[77816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odabebgokhrdsurcrhtgwhljkcxnjlhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211404.8373432-29-253285927504284/AnsiballZ_dnf.py'
Feb 27 16:56:46 compute-0 sudo[77816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:56:46 compute-0 python3.9[77819]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 27 16:56:47 compute-0 sudo[77816]: pam_unix(sudo:session): session closed for user root
Feb 27 16:56:48 compute-0 python3.9[77970]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:56:50 compute-0 python3.9[78121]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 16:56:51 compute-0 python3.9[78271]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:56:51 compute-0 python3.9[78421]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/nova follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:56:52 compute-0 sshd-session[77427]: Connection closed by 192.168.122.30 port 43266
Feb 27 16:56:52 compute-0 sshd-session[77424]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:56:52 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Feb 27 16:56:52 compute-0 systemd[1]: session-17.scope: Consumed 5.568s CPU time.
Feb 27 16:56:52 compute-0 systemd-logind[803]: Session 17 logged out. Waiting for processes to exit.
Feb 27 16:56:52 compute-0 systemd-logind[803]: Removed session 17.
Feb 27 16:56:57 compute-0 sshd-session[78446]: Accepted publickey for zuul from 192.168.122.30 port 37512 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:56:57 compute-0 systemd-logind[803]: New session 18 of user zuul.
Feb 27 16:56:57 compute-0 systemd[1]: Started Session 18 of User zuul.
Feb 27 16:56:57 compute-0 sshd-session[78446]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:56:58 compute-0 python3.9[78599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:57:00 compute-0 sudo[78753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwthuxglcwivbkqplvojkclbzsdoyrtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211419.8941374-45-225810509971770/AnsiballZ_file.py'
Feb 27 16:57:00 compute-0 sudo[78753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:00 compute-0 python3.9[78756]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:00 compute-0 sudo[78753]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:00 compute-0 sudo[78906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhoywfqurpgsnkdyrxjyodjudpjywbdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211420.639525-45-72353283188610/AnsiballZ_file.py'
Feb 27 16:57:00 compute-0 sudo[78906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:01 compute-0 python3.9[78909]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:01 compute-0 sudo[78906]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:01 compute-0 sudo[79059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-getdmuigvgkuekheoratonlvmdjdaglv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211421.4136865-60-92178764270013/AnsiballZ_stat.py'
Feb 27 16:57:01 compute-0 sudo[79059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:02 compute-0 python3.9[79062]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:02 compute-0 sudo[79059]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:02 compute-0 sudo[79183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecyiugytyqmogofuxxeqbebzizsjgojs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211421.4136865-60-92178764270013/AnsiballZ_copy.py'
Feb 27 16:57:02 compute-0 sudo[79183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:02 compute-0 python3.9[79186]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211421.4136865-60-92178764270013/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=821d480553c551f15ba85387204f0feb07401b52 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:02 compute-0 sudo[79183]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:03 compute-0 sudo[79336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npihpeifxjkzdmqewproyktkrugcmfpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211422.938009-60-145880514684614/AnsiballZ_stat.py'
Feb 27 16:57:03 compute-0 sudo[79336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:03 compute-0 python3.9[79339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:03 compute-0 sudo[79336]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:03 compute-0 sudo[79460]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpdyxrvdyajplfyhmlrshcdltqtsljmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211422.938009-60-145880514684614/AnsiballZ_copy.py'
Feb 27 16:57:03 compute-0 sudo[79460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:04 compute-0 python3.9[79463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211422.938009-60-145880514684614/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f454fa4d936d44579f01e7baf3cfbc6c5a763865 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:04 compute-0 sudo[79460]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:04 compute-0 sudo[79613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybaiupbpgyuvjltpfcycernftuotqcrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211424.1838925-60-215055493864900/AnsiballZ_stat.py'
Feb 27 16:57:04 compute-0 sudo[79613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:04 compute-0 python3.9[79616]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:04 compute-0 sudo[79613]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:05 compute-0 sudo[79737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbepebtkohusololnuqyvkvpylvggtrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211424.1838925-60-215055493864900/AnsiballZ_copy.py'
Feb 27 16:57:05 compute-0 sudo[79737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:05 compute-0 python3.9[79740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211424.1838925-60-215055493864900/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f13ed9e6a6381e001a05d671b42a2c54b9957b5b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:05 compute-0 sudo[79737]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:05 compute-0 sudo[79890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdmchfmjkfvceqljghchlwowvudcpsau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211425.4923375-104-278947520732322/AnsiballZ_file.py'
Feb 27 16:57:05 compute-0 sudo[79890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:06 compute-0 python3.9[79893]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:06 compute-0 sudo[79890]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:06 compute-0 sudo[80043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrqaylwimbafybrgiadgjsrvdykmcmbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211426.2390282-104-70662704408721/AnsiballZ_file.py'
Feb 27 16:57:06 compute-0 sudo[80043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:06 compute-0 python3.9[80046]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:06 compute-0 sudo[80043]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:07 compute-0 sudo[80196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmztmiicufuizjqwwlnmlvyxjfkquknl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211426.9861047-119-227948751679766/AnsiballZ_stat.py'
Feb 27 16:57:07 compute-0 sudo[80196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:07 compute-0 python3.9[80199]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:07 compute-0 sudo[80196]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:07 compute-0 sudo[80320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybkblcahsixbgfaxevovpzkerlnbvule ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211426.9861047-119-227948751679766/AnsiballZ_copy.py'
Feb 27 16:57:07 compute-0 sudo[80320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:08 compute-0 python3.9[80323]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211426.9861047-119-227948751679766/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=bc2ec6d8d69faf44230a2a06a2a682c757307019 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:08 compute-0 sudo[80320]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:08 compute-0 sudo[80473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dskwuxhhucfgewrddtawgfspxwcwuabn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211428.150988-119-38406811810940/AnsiballZ_stat.py'
Feb 27 16:57:08 compute-0 sudo[80473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:08 compute-0 python3.9[80476]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:08 compute-0 sudo[80473]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:08 compute-0 sudo[80597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djssgxmmflrqnhovmjmjolvbyoaytpfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211428.150988-119-38406811810940/AnsiballZ_copy.py'
Feb 27 16:57:08 compute-0 sudo[80597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:09 compute-0 python3.9[80600]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211428.150988-119-38406811810940/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=71e931f9fda94b7b34aa8461bf8aad241f354b81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:09 compute-0 sudo[80597]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:09 compute-0 sudo[80750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwqafccwyusthilqhyiffegsmqhsmrex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211429.3206189-119-194068726625104/AnsiballZ_stat.py'
Feb 27 16:57:09 compute-0 sudo[80750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:09 compute-0 python3.9[80753]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:09 compute-0 sudo[80750]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:10 compute-0 sudo[80874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylfgmqgzwaxhkecxzhsmebkbxjxubgtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211429.3206189-119-194068726625104/AnsiballZ_copy.py'
Feb 27 16:57:10 compute-0 sudo[80874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:10 compute-0 python3.9[80877]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211429.3206189-119-194068726625104/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=03a761e426e35146726d81e91677c5b9401d40e1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:10 compute-0 sudo[80874]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:10 compute-0 sudo[81027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnihornxpgssuukrlhnnfpfderqfolcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211430.6235027-163-25272659120360/AnsiballZ_file.py'
Feb 27 16:57:10 compute-0 sudo[81027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:11 compute-0 python3.9[81030]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:11 compute-0 sudo[81027]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:11 compute-0 sudo[81180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwzogrnponmmdgvybdcahjuceqiungdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211431.2051156-163-75825941690950/AnsiballZ_file.py'
Feb 27 16:57:11 compute-0 sudo[81180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:11 compute-0 python3.9[81183]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:11 compute-0 sudo[81180]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:12 compute-0 sudo[81333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkfviivftygtlbjlynzgdmfnhknfmktd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211431.923421-178-239051203620987/AnsiballZ_stat.py'
Feb 27 16:57:12 compute-0 sudo[81333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:12 compute-0 python3.9[81336]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:12 compute-0 sudo[81333]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:12 compute-0 sudo[81457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asfgmharcuvjukzulogszarurdrbmnen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211431.923421-178-239051203620987/AnsiballZ_copy.py'
Feb 27 16:57:12 compute-0 sudo[81457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:13 compute-0 python3.9[81460]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211431.923421-178-239051203620987/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=a8ffc88fa364f3691430c6b36c7c0565ecda2138 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:13 compute-0 sudo[81457]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:13 compute-0 sudo[81610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzyzxphglltezozpafpuqrtcnzhtdmrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211433.185441-178-264512279057901/AnsiballZ_stat.py'
Feb 27 16:57:13 compute-0 sudo[81610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:13 compute-0 python3.9[81613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:13 compute-0 sudo[81610]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:14 compute-0 sudo[81734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncwwxjpdaarltoervacmiuiquhesiezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211433.185441-178-264512279057901/AnsiballZ_copy.py'
Feb 27 16:57:14 compute-0 sudo[81734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:14 compute-0 python3.9[81737]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211433.185441-178-264512279057901/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=4bb341ba88eb41114118c347f56b5421f9903364 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:14 compute-0 sudo[81734]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:14 compute-0 sudo[81887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxezlxiuqishaxzymhnqgvjoqzsbojox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211434.695403-178-255510207925954/AnsiballZ_stat.py'
Feb 27 16:57:15 compute-0 sudo[81887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:15 compute-0 python3.9[81890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:15 compute-0 sudo[81887]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:15 compute-0 sudo[82011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkjsmogykjlaazdifgjtnjucofflwphr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211434.695403-178-255510207925954/AnsiballZ_copy.py'
Feb 27 16:57:15 compute-0 sudo[82011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:15 compute-0 python3.9[82014]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211434.695403-178-255510207925954/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a76998de115d71ada0ec106126fe97a7156f4199 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:15 compute-0 sudo[82011]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:16 compute-0 sudo[82164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbckdfdyplauqvpajedcxrseiydemnku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211435.9924147-222-155399507202941/AnsiballZ_file.py'
Feb 27 16:57:16 compute-0 sudo[82164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:16 compute-0 python3.9[82167]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:16 compute-0 sudo[82164]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:16 compute-0 sudo[82317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-matekzltiwxyleqrhuohlwmdqlrosazh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211436.6662714-222-190563213079464/AnsiballZ_file.py'
Feb 27 16:57:16 compute-0 sudo[82317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:17 compute-0 python3.9[82320]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:17 compute-0 sudo[82317]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:17 compute-0 sudo[82470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxhpvocikbddjwlzkixuglxkucvijhmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211437.350767-237-74531658427322/AnsiballZ_stat.py'
Feb 27 16:57:17 compute-0 sudo[82470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:17 compute-0 python3.9[82473]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:17 compute-0 sudo[82470]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:18 compute-0 sudo[82594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouxuxkutwpizqpjflgwlptebdgamdsyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211437.350767-237-74531658427322/AnsiballZ_copy.py'
Feb 27 16:57:18 compute-0 sudo[82594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:18 compute-0 python3.9[82597]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211437.350767-237-74531658427322/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=3e5c8fcce0b5710f57fda0a7bed9e24d08e8fe76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:18 compute-0 sudo[82594]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:18 compute-0 sudo[82747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbiarwarabasqjxcqrxoyvftgllkubik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211438.547319-237-185636805634116/AnsiballZ_stat.py'
Feb 27 16:57:18 compute-0 sudo[82747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:18 compute-0 python3.9[82750]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:19 compute-0 sudo[82747]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:19 compute-0 sudo[82871]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzwwswlvrshkdqnwupygrvsoojjagujg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211438.547319-237-185636805634116/AnsiballZ_copy.py'
Feb 27 16:57:19 compute-0 sudo[82871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:19 compute-0 python3.9[82874]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211438.547319-237-185636805634116/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=4bb341ba88eb41114118c347f56b5421f9903364 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:19 compute-0 sudo[82871]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:20 compute-0 sudo[83024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzyqhafxluogvacoxelzvbwhqgtdsfmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211439.6614256-237-38389773456379/AnsiballZ_stat.py'
Feb 27 16:57:20 compute-0 sudo[83024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:20 compute-0 python3.9[83027]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:20 compute-0 sudo[83024]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:21 compute-0 sudo[83148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhajfnyiyfkldbyaegpbjuuhfhfliohr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211439.6614256-237-38389773456379/AnsiballZ_copy.py'
Feb 27 16:57:21 compute-0 sudo[83148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:21 compute-0 python3.9[83151]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211439.6614256-237-38389773456379/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7de71c58147eb9f3a33c757632edce45f1bc3a17 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:21 compute-0 sudo[83148]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:22 compute-0 sudo[83301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwxkttnfrlwjyvapvyxunkmzvuotlzgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211442.1911116-297-11600587211319/AnsiballZ_file.py'
Feb 27 16:57:22 compute-0 sudo[83301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:22 compute-0 python3.9[83304]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:22 compute-0 sudo[83301]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:23 compute-0 sudo[83454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgczquljmxtxisejpmlsxgszorzikusa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211442.912623-305-274130602571021/AnsiballZ_stat.py'
Feb 27 16:57:23 compute-0 sudo[83454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:23 compute-0 python3.9[83457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:23 compute-0 sudo[83454]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:23 compute-0 sudo[83578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wablfxfvyfnaxmsvhzvtgzcvvkqhxyxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211442.912623-305-274130602571021/AnsiballZ_copy.py'
Feb 27 16:57:23 compute-0 sudo[83578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:24 compute-0 python3.9[83581]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211442.912623-305-274130602571021/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:24 compute-0 sudo[83578]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:24 compute-0 sudo[83731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdtcwzdxpdsofpsspztidjzvqvwmotrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211444.3095245-321-8361563392596/AnsiballZ_file.py'
Feb 27 16:57:24 compute-0 sudo[83731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:24 compute-0 python3.9[83734]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:24 compute-0 sudo[83731]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:25 compute-0 sudo[83884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjmzxjalhclbkwhridcbjoyrzmxizljq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211445.0667722-329-75755159391416/AnsiballZ_stat.py'
Feb 27 16:57:25 compute-0 sudo[83884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:25 compute-0 python3.9[83887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:25 compute-0 sudo[83884]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:26 compute-0 sudo[84008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nysauolgpxkjihrlrpmvmoooceewylou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211445.0667722-329-75755159391416/AnsiballZ_copy.py'
Feb 27 16:57:26 compute-0 sudo[84008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:26 compute-0 python3.9[84011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211445.0667722-329-75755159391416/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:26 compute-0 sudo[84008]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:26 compute-0 sudo[84161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spmmpxdofizdhclycbycssskkiygeoob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211446.5239737-345-89749432837943/AnsiballZ_file.py'
Feb 27 16:57:26 compute-0 sudo[84161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:27 compute-0 python3.9[84164]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:27 compute-0 sudo[84161]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:27 compute-0 sudo[84314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxgaxfucfmazcgwhgmmxzrluizlrvruz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211447.2724903-353-13853272759746/AnsiballZ_stat.py'
Feb 27 16:57:27 compute-0 sudo[84314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:28 compute-0 python3.9[84317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:28 compute-0 sudo[84314]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:28 compute-0 sudo[84438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-balowassilpdsgofwoqfstglyuteejob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211447.2724903-353-13853272759746/AnsiballZ_copy.py'
Feb 27 16:57:28 compute-0 sudo[84438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:28 compute-0 python3.9[84441]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211447.2724903-353-13853272759746/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:28 compute-0 sudo[84438]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:29 compute-0 sudo[84591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imchaeicvggmgswzllhyivpjfukrizzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211448.9426627-369-37689670780415/AnsiballZ_file.py'
Feb 27 16:57:29 compute-0 sudo[84591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:29 compute-0 python3.9[84594]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:29 compute-0 sudo[84591]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:29 compute-0 chronyd[66048]: Selected source 23.133.168.245 (pool.ntp.org)
Feb 27 16:57:29 compute-0 sudo[84744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwcwkqwxfysntrlqurzpmprumoabxmhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211449.637714-377-13074791819885/AnsiballZ_stat.py'
Feb 27 16:57:29 compute-0 sudo[84744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:30 compute-0 python3.9[84747]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:30 compute-0 sudo[84744]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:30 compute-0 sudo[84868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjvawxuttceqickvwpwpwtgzuqxlifeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211449.637714-377-13074791819885/AnsiballZ_copy.py'
Feb 27 16:57:30 compute-0 sudo[84868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:30 compute-0 python3.9[84871]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211449.637714-377-13074791819885/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:30 compute-0 sudo[84868]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:31 compute-0 sudo[85021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icjpohsjjuchbktwrknymnzrhkkewwtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211451.056337-393-89788745491889/AnsiballZ_file.py'
Feb 27 16:57:31 compute-0 sudo[85021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:31 compute-0 python3.9[85024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:31 compute-0 sudo[85021]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:32 compute-0 sudo[85174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epjioegiyeypgizaprplgvjpimgvsljn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211451.9161844-401-211867765976741/AnsiballZ_stat.py'
Feb 27 16:57:32 compute-0 sudo[85174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:32 compute-0 python3.9[85177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:32 compute-0 sudo[85174]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:32 compute-0 sudo[85298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhtrufkgsshvfqtrbkdmlmyepgjlqmrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211451.9161844-401-211867765976741/AnsiballZ_copy.py'
Feb 27 16:57:32 compute-0 sudo[85298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:33 compute-0 python3.9[85301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211451.9161844-401-211867765976741/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:33 compute-0 sudo[85298]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:33 compute-0 sudo[85451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehfmeifcdahhntumncwzzmglukktvyxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211453.4565644-417-188493311340803/AnsiballZ_file.py'
Feb 27 16:57:33 compute-0 sudo[85451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:34 compute-0 python3.9[85454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:34 compute-0 sudo[85451]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:34 compute-0 sudo[85604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xunxiywzdluqnrlrczhoafosxhsltzzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211454.2667525-425-263818349520502/AnsiballZ_stat.py'
Feb 27 16:57:34 compute-0 sudo[85604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:34 compute-0 python3.9[85607]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:34 compute-0 sudo[85604]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:35 compute-0 sudo[85728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmvaoogkhrpytnooylgrcktokdzxhsqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211454.2667525-425-263818349520502/AnsiballZ_copy.py'
Feb 27 16:57:35 compute-0 sudo[85728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:35 compute-0 python3.9[85731]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211454.2667525-425-263818349520502/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:35 compute-0 sudo[85728]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:35 compute-0 sudo[85881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-josomlktzwhmmqvqmwbngboowbskotcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211455.633792-441-95185003169572/AnsiballZ_file.py'
Feb 27 16:57:35 compute-0 sudo[85881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:36 compute-0 python3.9[85884]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:36 compute-0 sudo[85881]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:36 compute-0 sudo[86034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzusbufpxwehtlscinwhlmwgiqxnjyos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211456.3885465-449-44817780650905/AnsiballZ_stat.py'
Feb 27 16:57:36 compute-0 sudo[86034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:37 compute-0 python3.9[86037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:37 compute-0 sudo[86034]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:37 compute-0 sudo[86158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfvhszonfbizglnqlqxzifgjwihfauxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211456.3885465-449-44817780650905/AnsiballZ_copy.py'
Feb 27 16:57:37 compute-0 sudo[86158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:38 compute-0 python3.9[86161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211456.3885465-449-44817780650905/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=0b0dce1c63b93697635da4e15a27b572fef59c62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:38 compute-0 sudo[86158]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:38 compute-0 sshd-session[78449]: Connection closed by 192.168.122.30 port 37512
Feb 27 16:57:38 compute-0 sshd-session[78446]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:57:38 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Feb 27 16:57:38 compute-0 systemd[1]: session-18.scope: Consumed 26.957s CPU time.
Feb 27 16:57:38 compute-0 systemd-logind[803]: Session 18 logged out. Waiting for processes to exit.
Feb 27 16:57:38 compute-0 systemd-logind[803]: Removed session 18.
Feb 27 16:57:44 compute-0 sshd-session[86186]: Accepted publickey for zuul from 192.168.122.30 port 47122 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:57:44 compute-0 systemd-logind[803]: New session 19 of user zuul.
Feb 27 16:57:44 compute-0 systemd[1]: Started Session 19 of User zuul.
Feb 27 16:57:44 compute-0 sshd-session[86186]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:57:45 compute-0 python3.9[86339]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:57:46 compute-0 sudo[86493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvqkboenurjgxmolxraadurqacoeluvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211466.1904368-29-115857916495412/AnsiballZ_file.py'
Feb 27 16:57:46 compute-0 sudo[86493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:46 compute-0 python3.9[86496]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:46 compute-0 sudo[86493]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:47 compute-0 sudo[86646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxclngyxxdzrodsttbjfhfdyerpthlcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211466.944764-29-207968616438674/AnsiballZ_file.py'
Feb 27 16:57:47 compute-0 sudo[86646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:47 compute-0 python3.9[86649]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:57:47 compute-0 sudo[86646]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:48 compute-0 python3.9[86799]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:57:48 compute-0 sudo[86949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuuedyrvkkfslswueghrqdaeervjnrrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211468.356122-52-213354398557266/AnsiballZ_seboolean.py'
Feb 27 16:57:48 compute-0 sudo[86949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:49 compute-0 python3.9[86952]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 27 16:57:50 compute-0 sudo[86949]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:51 compute-0 sudo[87106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlavsmqridwbiopdjcinknbmyaclglrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211470.740174-62-260456912460477/AnsiballZ_setup.py'
Feb 27 16:57:51 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 27 16:57:51 compute-0 sudo[87106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:51 compute-0 python3.9[87109]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:57:51 compute-0 sudo[87106]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:51 compute-0 sudo[87191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-popymvfgaxreolsvrwzbrxbdzqfwcckm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211470.740174-62-260456912460477/AnsiballZ_dnf.py'
Feb 27 16:57:51 compute-0 sudo[87191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:52 compute-0 python3.9[87194]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:57:53 compute-0 sudo[87191]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:54 compute-0 sudo[87345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gpurejusniykkiswioymnqcikwwsnwdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211473.727193-74-172631942336404/AnsiballZ_systemd.py'
Feb 27 16:57:54 compute-0 sudo[87345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:54 compute-0 python3.9[87348]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 16:57:54 compute-0 sudo[87345]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:55 compute-0 sudo[87501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifwdlcxzehwlklemrndamsldwboaqadb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211474.904394-82-122314688283314/AnsiballZ_edpm_nftables_snippet.py'
Feb 27 16:57:55 compute-0 sudo[87501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:55 compute-0 python3[87504]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 27 16:57:55 compute-0 sudo[87501]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:56 compute-0 sudo[87654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jizsvdrhhaorciflkjgqvxizczvecanq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211475.7941623-91-263089108166712/AnsiballZ_file.py'
Feb 27 16:57:56 compute-0 sudo[87654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:56 compute-0 python3.9[87657]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:56 compute-0 sudo[87654]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:56 compute-0 sudo[87807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbyleucvhyldpoaakjzmqjyhxxuitjoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211476.5925212-99-71229488157757/AnsiballZ_stat.py'
Feb 27 16:57:56 compute-0 sudo[87807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:57 compute-0 python3.9[87810]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:57 compute-0 sudo[87807]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:57 compute-0 sudo[87886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzpjirhruacciqrccamexajownppmfrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211476.5925212-99-71229488157757/AnsiballZ_file.py'
Feb 27 16:57:57 compute-0 sudo[87886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:57 compute-0 python3.9[87889]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:57 compute-0 sudo[87886]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:58 compute-0 sudo[88039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqwaedrojdhskiohuztfpqxrbivszatk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211477.9849737-111-37764008815992/AnsiballZ_stat.py'
Feb 27 16:57:58 compute-0 sudo[88039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:58 compute-0 python3.9[88042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:58 compute-0 sudo[88039]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:58 compute-0 sudo[88118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtyygjbfnnaykqiallaienoruxseaafk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211477.9849737-111-37764008815992/AnsiballZ_file.py'
Feb 27 16:57:58 compute-0 sudo[88118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:59 compute-0 python3.9[88121]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.zm0kxag_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:57:59 compute-0 sudo[88118]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:59 compute-0 sudo[88271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niptxvlzrkkszjeddbfvepagwqaqtmsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211479.2539113-123-54284685716214/AnsiballZ_stat.py'
Feb 27 16:57:59 compute-0 sudo[88271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:57:59 compute-0 python3.9[88274]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:57:59 compute-0 sudo[88271]: pam_unix(sudo:session): session closed for user root
Feb 27 16:57:59 compute-0 sudo[88350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sydbrsykcktvwkblfiehubiiokfcudoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211479.2539113-123-54284685716214/AnsiballZ_file.py'
Feb 27 16:57:59 compute-0 sudo[88350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:00 compute-0 python3.9[88353]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:00 compute-0 sudo[88350]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:00 compute-0 sudo[88503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqsnhegukxnfqkgecicxdffdhnleeqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211480.4446552-136-110972117469196/AnsiballZ_command.py'
Feb 27 16:58:00 compute-0 sudo[88503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:01 compute-0 python3.9[88506]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:01 compute-0 sudo[88503]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:01 compute-0 sudo[88657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vydnxebrdevvvqnjyfgiuiptoiobcnwy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211481.3698316-144-261008637473149/AnsiballZ_edpm_nftables_from_files.py'
Feb 27 16:58:01 compute-0 sudo[88657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:02 compute-0 python3[88660]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 27 16:58:02 compute-0 sudo[88657]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:02 compute-0 sudo[88810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxtobylypgclaeabodnmhigmydzkivmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211482.2253726-152-224841475059501/AnsiballZ_stat.py'
Feb 27 16:58:02 compute-0 sudo[88810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:02 compute-0 python3.9[88813]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:02 compute-0 sudo[88810]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:03 compute-0 sudo[88936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctiozlbhekyaixdvpyzownghdismlzfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211482.2253726-152-224841475059501/AnsiballZ_copy.py'
Feb 27 16:58:03 compute-0 sudo[88936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:03 compute-0 python3.9[88939]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211482.2253726-152-224841475059501/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:03 compute-0 sudo[88936]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:04 compute-0 sudo[89089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owttpeagscwteavhvaikvimjbezpthtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211483.7644067-167-68364185305505/AnsiballZ_stat.py'
Feb 27 16:58:04 compute-0 sudo[89089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:04 compute-0 python3.9[89092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:04 compute-0 sudo[89089]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:04 compute-0 sudo[89215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkdlkvvhvpkzifjtedgfhjetcfpqsjci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211483.7644067-167-68364185305505/AnsiballZ_copy.py'
Feb 27 16:58:04 compute-0 sudo[89215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:04 compute-0 python3.9[89218]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211483.7644067-167-68364185305505/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:04 compute-0 sudo[89215]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:05 compute-0 sudo[89368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmzmogbdwbndotivtgzytaszalmbhcyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211485.0578227-182-210034019130902/AnsiballZ_stat.py'
Feb 27 16:58:05 compute-0 sudo[89368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:05 compute-0 python3.9[89371]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:05 compute-0 sudo[89368]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:05 compute-0 sudo[89494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmedrxeitqugdfcszynnzgwgiqompnqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211485.0578227-182-210034019130902/AnsiballZ_copy.py'
Feb 27 16:58:05 compute-0 sudo[89494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:06 compute-0 python3.9[89497]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211485.0578227-182-210034019130902/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:06 compute-0 sudo[89494]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:06 compute-0 sudo[89647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upgjsmsjrsxrgiihbqgojsafxntrhfis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211486.500301-197-142844813680063/AnsiballZ_stat.py'
Feb 27 16:58:06 compute-0 sudo[89647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:07 compute-0 python3.9[89650]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:07 compute-0 sudo[89647]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:07 compute-0 sudo[89773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-remrdakkuldqkrmzbwgvlktaxufaqjjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211486.500301-197-142844813680063/AnsiballZ_copy.py'
Feb 27 16:58:07 compute-0 sudo[89773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:07 compute-0 python3.9[89776]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211486.500301-197-142844813680063/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:07 compute-0 sudo[89773]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:08 compute-0 sudo[89926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsgndrlctouqxmsakzhplyihfiffwymw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211487.804833-212-190807399235501/AnsiballZ_stat.py'
Feb 27 16:58:08 compute-0 sudo[89926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:08 compute-0 python3.9[89929]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:08 compute-0 sudo[89926]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:08 compute-0 sudo[90052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgitbxtaumltkbopvsghkaqtlggmwjwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211487.804833-212-190807399235501/AnsiballZ_copy.py'
Feb 27 16:58:08 compute-0 sudo[90052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:09 compute-0 python3.9[90055]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211487.804833-212-190807399235501/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:09 compute-0 sudo[90052]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:09 compute-0 sudo[90205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yasomfvhckzgkuounmyqbtyudghsitck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211489.184449-227-153439225678927/AnsiballZ_file.py'
Feb 27 16:58:09 compute-0 sudo[90205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:09 compute-0 python3.9[90208]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:09 compute-0 sudo[90205]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:10 compute-0 sudo[90358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adtgbuzqqmshfxsmktoewtuggfztfrhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211489.8256547-235-195753647946678/AnsiballZ_command.py'
Feb 27 16:58:10 compute-0 sudo[90358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:10 compute-0 python3.9[90361]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:10 compute-0 sudo[90358]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:11 compute-0 sudo[90514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yczylljndcfxlomdvndovamypfoiwtjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211490.5968606-243-254523309294079/AnsiballZ_blockinfile.py'
Feb 27 16:58:11 compute-0 sudo[90514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:11 compute-0 python3.9[90517]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:11 compute-0 sudo[90514]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:11 compute-0 sudo[90667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohmiofrylsacrazlgwwesaardfigdxlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211491.6370835-252-203058499195408/AnsiballZ_command.py'
Feb 27 16:58:11 compute-0 sudo[90667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:12 compute-0 python3.9[90670]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:12 compute-0 sudo[90667]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:12 compute-0 sudo[90821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmxljeclfvwjgamowuldjoddnnrmgbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211492.3987093-260-153012151136637/AnsiballZ_stat.py'
Feb 27 16:58:12 compute-0 sudo[90821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:12 compute-0 python3.9[90824]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:58:12 compute-0 sudo[90821]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:13 compute-0 sudo[90976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzjrbzcxcykljkmhprhosgkzcwjsoyjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211493.084721-268-138820637740197/AnsiballZ_command.py'
Feb 27 16:58:13 compute-0 sudo[90976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:13 compute-0 python3.9[90979]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:13 compute-0 sudo[90976]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:14 compute-0 sudo[91132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-equbsshvgyquwyjgadouygloydhlwlyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211493.8406208-276-181735383944317/AnsiballZ_file.py'
Feb 27 16:58:14 compute-0 sudo[91132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:14 compute-0 python3.9[91135]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:14 compute-0 sudo[91132]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:15 compute-0 python3.9[91285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:58:16 compute-0 sudo[91436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyizwdmxiukepehsxmwgdzlvsswljhcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211496.2382073-317-164998423209663/AnsiballZ_command.py'
Feb 27 16:58:16 compute-0 sudo[91436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:16 compute-0 python3.9[91439]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:e0:eb:c4:a5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:16 compute-0 ovs-vsctl[91440]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:e0:eb:c4:a5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 27 16:58:16 compute-0 sudo[91436]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:17 compute-0 sudo[91590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yawthdmiyffbunvusnzvftqrglqrzbwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211496.9278247-326-232522183011846/AnsiballZ_command.py'
Feb 27 16:58:17 compute-0 sudo[91590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:17 compute-0 python3.9[91593]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:17 compute-0 sudo[91590]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:17 compute-0 sudo[91746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eutgryyogdmiljhlgporvrozxsaqpubr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211497.6520543-334-154412577483203/AnsiballZ_command.py'
Feb 27 16:58:17 compute-0 sudo[91746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:18 compute-0 python3.9[91749]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:18 compute-0 ovs-vsctl[91750]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 27 16:58:18 compute-0 sudo[91746]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:18 compute-0 python3.9[91900]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:58:19 compute-0 sudo[92052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrjiqfsjydcaosdzmlesfzasfipnguhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211499.0847642-351-191872871926516/AnsiballZ_file.py'
Feb 27 16:58:19 compute-0 sudo[92052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:19 compute-0 python3.9[92055]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:19 compute-0 sudo[92052]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:20 compute-0 sudo[92205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijsfjsczeyahdigqoogtnkhxwtuqjsdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211499.792682-359-265061549458148/AnsiballZ_stat.py'
Feb 27 16:58:20 compute-0 sudo[92205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:20 compute-0 python3.9[92208]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:20 compute-0 sudo[92205]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:20 compute-0 sudo[92284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqgbwajfvdzlqkvshgiaobemlkzdbpqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211499.792682-359-265061549458148/AnsiballZ_file.py'
Feb 27 16:58:20 compute-0 sudo[92284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:20 compute-0 python3.9[92287]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:20 compute-0 sudo[92284]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:21 compute-0 sudo[92437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tavqkqboedwairdchgtzaeewueyyizza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211500.9177487-359-268414052988172/AnsiballZ_stat.py'
Feb 27 16:58:21 compute-0 sudo[92437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:21 compute-0 python3.9[92440]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:21 compute-0 sudo[92437]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:21 compute-0 sudo[92516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbuvtwspglcsclsvubyvwfavexzoemzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211500.9177487-359-268414052988172/AnsiballZ_file.py'
Feb 27 16:58:21 compute-0 sudo[92516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:21 compute-0 python3.9[92519]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:21 compute-0 sudo[92516]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:22 compute-0 sudo[92669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtcxtaxdxwsqiazbcpqmeigihpltkvsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211502.211138-382-255744377285721/AnsiballZ_file.py'
Feb 27 16:58:22 compute-0 sudo[92669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:22 compute-0 python3.9[92672]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:22 compute-0 sudo[92669]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:23 compute-0 sudo[92822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgknmjuxsejbgcsxrznkoeursfpwhfde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211502.852338-390-34880694073152/AnsiballZ_stat.py'
Feb 27 16:58:23 compute-0 sudo[92822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:23 compute-0 python3.9[92825]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:23 compute-0 sudo[92822]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:23 compute-0 sudo[92901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smrplixchyjaddhsgepnynuzpxhayslz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211502.852338-390-34880694073152/AnsiballZ_file.py'
Feb 27 16:58:23 compute-0 sudo[92901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:23 compute-0 python3.9[92904]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:23 compute-0 sudo[92901]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:24 compute-0 sudo[93054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvpvcvepmwkmurdwkenkwflnmmbipgve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211504.061574-402-39173596560785/AnsiballZ_stat.py'
Feb 27 16:58:24 compute-0 sudo[93054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:24 compute-0 python3.9[93057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:24 compute-0 sudo[93054]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:24 compute-0 sudo[93133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udxpjdwimrekopleryjopsvxdgkmtlse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211504.061574-402-39173596560785/AnsiballZ_file.py'
Feb 27 16:58:24 compute-0 sudo[93133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:25 compute-0 python3.9[93136]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:25 compute-0 sudo[93133]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:25 compute-0 sudo[93286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wledchvftlcsvdgzzgonaagpqufylazf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211505.2513382-414-39779888173102/AnsiballZ_systemd.py'
Feb 27 16:58:25 compute-0 sudo[93286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:25 compute-0 python3.9[93289]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:58:25 compute-0 systemd[1]: Reloading.
Feb 27 16:58:26 compute-0 systemd-sysv-generator[93317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:58:26 compute-0 systemd-rc-local-generator[93313]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:58:26 compute-0 sudo[93286]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:26 compute-0 sudo[93483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilbbrafigaehaoqrbsjmbnholciinbai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211506.4333339-422-53164934726145/AnsiballZ_stat.py'
Feb 27 16:58:26 compute-0 sudo[93483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:26 compute-0 python3.9[93486]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:26 compute-0 sudo[93483]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:27 compute-0 sudo[93562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxjpuzqjrgqiadznkcnzwmzpfgszpjuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211506.4333339-422-53164934726145/AnsiballZ_file.py'
Feb 27 16:58:27 compute-0 sudo[93562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:27 compute-0 python3.9[93565]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:27 compute-0 sudo[93562]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:27 compute-0 sudo[93715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywstavcwhrofbgzepuvxtxirapjbnwpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211507.6388242-434-72095611359345/AnsiballZ_stat.py'
Feb 27 16:58:27 compute-0 sudo[93715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:28 compute-0 python3.9[93718]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:28 compute-0 sudo[93715]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:28 compute-0 sudo[93794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vikhaloicyprqzlpriislkhpfoihivjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211507.6388242-434-72095611359345/AnsiballZ_file.py'
Feb 27 16:58:28 compute-0 sudo[93794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:28 compute-0 python3.9[93797]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:28 compute-0 sudo[93794]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:29 compute-0 sudo[93947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxapnvtuzkzjwqvtiytfzfpffuuigjth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211508.907207-446-12139746805881/AnsiballZ_systemd.py'
Feb 27 16:58:29 compute-0 sudo[93947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:29 compute-0 python3.9[93950]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:58:29 compute-0 systemd[1]: Reloading.
Feb 27 16:58:29 compute-0 systemd-rc-local-generator[93974]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:58:29 compute-0 systemd-sysv-generator[93977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:58:29 compute-0 systemd[1]: Starting Create netns directory...
Feb 27 16:58:29 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 27 16:58:29 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 27 16:58:29 compute-0 systemd[1]: Finished Create netns directory.
Feb 27 16:58:29 compute-0 sudo[93947]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:30 compute-0 sudo[94148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xesolxttiodauoayynbmamhwdqtgihfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211509.9978282-456-157521300805800/AnsiballZ_file.py'
Feb 27 16:58:30 compute-0 sudo[94148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:30 compute-0 python3.9[94151]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:30 compute-0 sudo[94148]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:31 compute-0 sudo[94301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcsorgepvjjzwsvbodbqmrypuwnldlun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211510.74802-464-152203028265007/AnsiballZ_stat.py'
Feb 27 16:58:31 compute-0 sudo[94301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:31 compute-0 python3.9[94304]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:31 compute-0 sudo[94301]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:31 compute-0 sudo[94425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwdohqwjoklkfipqaxxcttgcuwwuryoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211510.74802-464-152203028265007/AnsiballZ_copy.py'
Feb 27 16:58:31 compute-0 sudo[94425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:31 compute-0 python3.9[94428]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211510.74802-464-152203028265007/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:31 compute-0 sudo[94425]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:32 compute-0 sudo[94578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcamkkqsveqwvtyeyzpjlbhcpcknvmfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211512.3143532-481-48708523005979/AnsiballZ_file.py'
Feb 27 16:58:32 compute-0 sudo[94578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:32 compute-0 python3.9[94581]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:32 compute-0 sudo[94578]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:33 compute-0 sudo[94731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylbqhjaqxuzkfttdohbouvwxpbvfufoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211512.9949546-489-183478928663576/AnsiballZ_file.py'
Feb 27 16:58:33 compute-0 sudo[94731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:33 compute-0 python3.9[94734]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:58:33 compute-0 sudo[94731]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:34 compute-0 sudo[94884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivqswjswdiexzlpahjelindhpyesechz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211513.7521095-497-226254775264088/AnsiballZ_stat.py'
Feb 27 16:58:34 compute-0 sudo[94884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:34 compute-0 python3.9[94887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:34 compute-0 sudo[94884]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:34 compute-0 sudo[95008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbqmnxjxddybgjalflgifvcsgjuaxdjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211513.7521095-497-226254775264088/AnsiballZ_copy.py'
Feb 27 16:58:34 compute-0 sudo[95008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:34 compute-0 python3.9[95011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211513.7521095-497-226254775264088/.source.json _original_basename=.ww81ar5o follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:34 compute-0 sudo[95008]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:35 compute-0 python3.9[95161]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:37 compute-0 sudo[95582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfwngjzptbcfwlqjyqwnoownooxseray ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211517.2230456-537-130524043822735/AnsiballZ_container_config_data.py'
Feb 27 16:58:37 compute-0 sudo[95582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:37 compute-0 python3.9[95585]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 27 16:58:37 compute-0 sudo[95582]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:38 compute-0 sudo[95735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvxqhxdhdjrjubacjltxmpiuqjddlidz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211518.1508317-548-270323249715899/AnsiballZ_container_config_hash.py'
Feb 27 16:58:38 compute-0 sudo[95735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:38 compute-0 python3.9[95738]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 16:58:38 compute-0 sudo[95735]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:39 compute-0 sudo[95888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dchjtlpawpwhmbugnbwkeddextgvlkej ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211519.2276278-558-99944574838054/AnsiballZ_edpm_container_manage.py'
Feb 27 16:58:39 compute-0 sudo[95888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:40 compute-0 python3[95891]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 16:58:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:58:40 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:58:41 compute-0 podman[95927]: 2026-02-27 16:58:41.147330514 +0000 UTC m=+0.021650981 image pull ce6781f051bf092c13d84cb587c56ad7edaa58b70fcc0effc1dff15724d5232e quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 27 16:58:41 compute-0 podman[95927]: 2026-02-27 16:58:41.300474201 +0000 UTC m=+0.174794658 container create 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, container_name=ovn_controller)
Feb 27 16:58:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:58:41 compute-0 python3[95891]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 27 16:58:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:58:41 compute-0 sudo[95888]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:41 compute-0 sudo[96111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypqaabbyhabdlwavtjctwfcsitsemwph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211521.674755-566-211116562187383/AnsiballZ_stat.py'
Feb 27 16:58:41 compute-0 sudo[96111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:42 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 27 16:58:42 compute-0 python3.9[96114]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:58:42 compute-0 sudo[96111]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:42 compute-0 sudo[96266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovkqniekzyyjlwdiytjipmaqdzwyujxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211522.4974935-575-247443497678563/AnsiballZ_file.py'
Feb 27 16:58:42 compute-0 sudo[96266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:42 compute-0 python3.9[96269]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:43 compute-0 sudo[96266]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:43 compute-0 sudo[96343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmgpcyodmxqbblteexkhykocxifuocgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211522.4974935-575-247443497678563/AnsiballZ_stat.py'
Feb 27 16:58:43 compute-0 sudo[96343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:43 compute-0 python3.9[96346]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:58:43 compute-0 sudo[96343]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:43 compute-0 sudo[96495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpuqewyasojjcqlagqdkmbolbmtxuppz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211523.4199212-575-97976849607042/AnsiballZ_copy.py'
Feb 27 16:58:43 compute-0 sudo[96495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:44 compute-0 python3.9[96498]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772211523.4199212-575-97976849607042/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:44 compute-0 sudo[96495]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:44 compute-0 sudo[96572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-renamjdnqhazezewmxetuylzechnmtzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211523.4199212-575-97976849607042/AnsiballZ_systemd.py'
Feb 27 16:58:44 compute-0 sudo[96572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:44 compute-0 python3.9[96575]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 16:58:44 compute-0 systemd[1]: Reloading.
Feb 27 16:58:44 compute-0 systemd-rc-local-generator[96598]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:58:44 compute-0 systemd-sysv-generator[96601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:58:44 compute-0 sudo[96572]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:45 compute-0 sudo[96690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnykjwpulelagkjasajdjcbfzpgwpvlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211523.4199212-575-97976849607042/AnsiballZ_systemd.py'
Feb 27 16:58:45 compute-0 sudo[96690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:45 compute-0 python3.9[96693]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:58:45 compute-0 systemd[1]: Reloading.
Feb 27 16:58:45 compute-0 systemd-rc-local-generator[96724]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:58:45 compute-0 systemd-sysv-generator[96728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:58:45 compute-0 systemd[1]: Starting ovn_controller container...
Feb 27 16:58:45 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Feb 27 16:58:45 compute-0 systemd[1]: Started libcrun container.
Feb 27 16:58:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a814bfbf3b2f48250a91b37999f0ca1477bd2aebcf2ffcb7a7be9803e42ab584/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 27 16:58:46 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580.
Feb 27 16:58:46 compute-0 podman[96741]: 2026-02-27 16:58:46.02879352 +0000 UTC m=+0.230219761 container init 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + sudo -E kolla_set_configs
Feb 27 16:58:46 compute-0 podman[96741]: 2026-02-27 16:58:46.064716568 +0000 UTC m=+0.266142829 container start 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 16:58:46 compute-0 edpm-start-podman-container[96741]: ovn_controller
Feb 27 16:58:46 compute-0 systemd[1]: Created slice User Slice of UID 0.
Feb 27 16:58:46 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 27 16:58:46 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 27 16:58:46 compute-0 edpm-start-podman-container[96740]: Creating additional drop-in dependency for "ovn_controller" (958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580)
Feb 27 16:58:46 compute-0 podman[96763]: 2026-02-27 16:58:46.173674774 +0000 UTC m=+0.095107289 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 27 16:58:46 compute-0 systemd[1]: Starting User Manager for UID 0...
Feb 27 16:58:46 compute-0 systemd[1]: Reloading.
Feb 27 16:58:46 compute-0 systemd[96800]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Feb 27 16:58:46 compute-0 systemd-rc-local-generator[96844]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:58:46 compute-0 systemd-sysv-generator[96849]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:58:46 compute-0 systemd[96800]: Queued start job for default target Main User Target.
Feb 27 16:58:46 compute-0 systemd[96800]: Created slice User Application Slice.
Feb 27 16:58:46 compute-0 systemd[96800]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 27 16:58:46 compute-0 systemd[96800]: Started Daily Cleanup of User's Temporary Directories.
Feb 27 16:58:46 compute-0 systemd[96800]: Reached target Paths.
Feb 27 16:58:46 compute-0 systemd[96800]: Reached target Timers.
Feb 27 16:58:46 compute-0 systemd[96800]: Starting D-Bus User Message Bus Socket...
Feb 27 16:58:46 compute-0 systemd[96800]: Starting Create User's Volatile Files and Directories...
Feb 27 16:58:46 compute-0 systemd[96800]: Listening on D-Bus User Message Bus Socket.
Feb 27 16:58:46 compute-0 systemd[96800]: Reached target Sockets.
Feb 27 16:58:46 compute-0 systemd[96800]: Finished Create User's Volatile Files and Directories.
Feb 27 16:58:46 compute-0 systemd[96800]: Reached target Basic System.
Feb 27 16:58:46 compute-0 systemd[96800]: Reached target Main User Target.
Feb 27 16:58:46 compute-0 systemd[96800]: Startup finished in 144ms.
Feb 27 16:58:46 compute-0 systemd[1]: Started User Manager for UID 0.
Feb 27 16:58:46 compute-0 systemd[1]: Started ovn_controller container.
Feb 27 16:58:46 compute-0 systemd[1]: 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580-617a1ead809fcbe9.service: Main process exited, code=exited, status=1/FAILURE
Feb 27 16:58:46 compute-0 systemd[1]: 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580-617a1ead809fcbe9.service: Failed with result 'exit-code'.
Feb 27 16:58:46 compute-0 systemd[1]: Started Session c1 of User root.
Feb 27 16:58:46 compute-0 sudo[96690]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:46 compute-0 ovn_controller[96756]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 27 16:58:46 compute-0 ovn_controller[96756]: INFO:__main__:Validating config file
Feb 27 16:58:46 compute-0 ovn_controller[96756]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 27 16:58:46 compute-0 ovn_controller[96756]: INFO:__main__:Writing out command to execute
Feb 27 16:58:46 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: ++ cat /run_command
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + ARGS=
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + sudo kolla_copy_cacerts
Feb 27 16:58:46 compute-0 systemd[1]: Started Session c2 of User root.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + [[ ! -n '' ]]
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + . kolla_extend_start
Feb 27 16:58:46 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + umask 0022
Feb 27 16:58:46 compute-0 ovn_controller[96756]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6151] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6160] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <warn>  [1772211526.6164] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6174] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6182] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6188] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 27 16:58:46 compute-0 kernel: br-int: entered promiscuous mode
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00014|main|INFO|OVS feature set changed, force recompute.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00022|main|INFO|OVS feature set changed, force recompute.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 27 16:58:46 compute-0 ovn_controller[96756]: 2026-02-27T16:58:46Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6450] manager: (ovn-196337-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Feb 27 16:58:46 compute-0 systemd-udevd[96898]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:58:46 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Feb 27 16:58:46 compute-0 systemd-udevd[96899]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6694] device (genev_sys_6081): carrier: link connected
Feb 27 16:58:46 compute-0 NetworkManager[56537]: <info>  [1772211526.6697] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 27 16:58:47 compute-0 python3.9[97028]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 16:58:49 compute-0 sudo[97179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipofngrfvepagusxoizshutpfprgrgqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211527.9220862-620-177407794181552/AnsiballZ_stat.py'
Feb 27 16:58:49 compute-0 sudo[97179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:49 compute-0 python3.9[97182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:58:49 compute-0 sudo[97179]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:50 compute-0 sudo[97303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuvirkjtvsrcoqemzxiwmxwlhrfubtrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211527.9220862-620-177407794181552/AnsiballZ_copy.py'
Feb 27 16:58:50 compute-0 sudo[97303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:50 compute-0 python3.9[97306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211527.9220862-620-177407794181552/.source.yaml _original_basename=.h_e3nqkk follow=False checksum=75e5f7f62080450bcab2e07c24c09e40c0a79178 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:58:50 compute-0 sudo[97303]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:50 compute-0 sudo[97456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxdqsegdnuqkettctmkgxsuzackffyga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211530.5475523-635-73349159087054/AnsiballZ_command.py'
Feb 27 16:58:50 compute-0 sudo[97456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:51 compute-0 python3.9[97459]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:51 compute-0 ovs-vsctl[97460]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 27 16:58:51 compute-0 sudo[97456]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:51 compute-0 sudo[97610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvcisjmknkefmbfmqwtxrkxvdrqblggi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211531.2786345-643-123812393360353/AnsiballZ_command.py'
Feb 27 16:58:51 compute-0 sudo[97610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:51 compute-0 python3.9[97613]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:51 compute-0 ovs-vsctl[97615]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 27 16:58:51 compute-0 sudo[97610]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:52 compute-0 sudo[97766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgzqkbhsxpvpmsoocdybojbgwjjboxnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211532.162114-657-61013669346714/AnsiballZ_command.py'
Feb 27 16:58:52 compute-0 sudo[97766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:58:52 compute-0 python3.9[97769]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:58:52 compute-0 ovs-vsctl[97770]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 27 16:58:52 compute-0 sudo[97766]: pam_unix(sudo:session): session closed for user root
Feb 27 16:58:53 compute-0 sshd-session[86189]: Connection closed by 192.168.122.30 port 47122
Feb 27 16:58:53 compute-0 sshd-session[86186]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:58:53 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Feb 27 16:58:53 compute-0 systemd[1]: session-19.scope: Consumed 45.228s CPU time.
Feb 27 16:58:53 compute-0 systemd-logind[803]: Session 19 logged out. Waiting for processes to exit.
Feb 27 16:58:53 compute-0 systemd-logind[803]: Removed session 19.
Feb 27 16:58:56 compute-0 systemd[1]: Stopping User Manager for UID 0...
Feb 27 16:58:56 compute-0 systemd[96800]: Activating special unit Exit the Session...
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped target Main User Target.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped target Basic System.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped target Paths.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped target Sockets.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped target Timers.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 27 16:58:56 compute-0 systemd[96800]: Closed D-Bus User Message Bus Socket.
Feb 27 16:58:56 compute-0 systemd[96800]: Stopped Create User's Volatile Files and Directories.
Feb 27 16:58:56 compute-0 systemd[96800]: Removed slice User Application Slice.
Feb 27 16:58:56 compute-0 systemd[96800]: Reached target Shutdown.
Feb 27 16:58:56 compute-0 systemd[96800]: Finished Exit the Session.
Feb 27 16:58:56 compute-0 systemd[96800]: Reached target Exit the Session.
Feb 27 16:58:56 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Feb 27 16:58:56 compute-0 systemd[1]: Stopped User Manager for UID 0.
Feb 27 16:58:56 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 27 16:58:56 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 27 16:58:56 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 27 16:58:56 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 27 16:58:56 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Feb 27 16:58:58 compute-0 sshd-session[97797]: Accepted publickey for zuul from 192.168.122.30 port 56674 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:58:58 compute-0 systemd-logind[803]: New session 21 of user zuul.
Feb 27 16:58:59 compute-0 systemd[1]: Started Session 21 of User zuul.
Feb 27 16:58:59 compute-0 sshd-session[97797]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:58:59 compute-0 python3.9[97950]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:59:00 compute-0 sudo[98104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acgletnkkkxhptvecivzevqatzhzjrkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211540.491678-29-84542447813593/AnsiballZ_file.py'
Feb 27 16:59:00 compute-0 sudo[98104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:01 compute-0 python3.9[98107]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:01 compute-0 sudo[98104]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:01 compute-0 sudo[98257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgbhgcixmxjgwkdnlmblhhuewzccyqsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211541.2638147-29-226653672866982/AnsiballZ_file.py'
Feb 27 16:59:01 compute-0 sudo[98257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:01 compute-0 python3.9[98260]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:01 compute-0 sudo[98257]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:02 compute-0 sudo[98410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shpmkdcwhbifbrxaxlywmjpihtqujguq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211541.9363718-29-92904325998381/AnsiballZ_file.py'
Feb 27 16:59:02 compute-0 sudo[98410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:02 compute-0 python3.9[98413]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:02 compute-0 sudo[98410]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:02 compute-0 sudo[98563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzgemxlxnibuvgoogcgftdcaounmlywr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211542.6017308-29-67444294795417/AnsiballZ_file.py'
Feb 27 16:59:02 compute-0 sudo[98563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:03 compute-0 python3.9[98566]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:03 compute-0 sudo[98563]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:03 compute-0 sudo[98716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anxbxjwyhzyxuhaezilqkcdlfnfoeddg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211543.1585312-29-228828852824956/AnsiballZ_file.py'
Feb 27 16:59:03 compute-0 sudo[98716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:03 compute-0 python3.9[98719]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:03 compute-0 sudo[98716]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:04 compute-0 python3.9[98869]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:59:04 compute-0 sudo[99020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hupvjquextyyneanuzjxevbgqpedwxfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211544.5708873-73-255056470686835/AnsiballZ_seboolean.py'
Feb 27 16:59:04 compute-0 sudo[99020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:05 compute-0 python3.9[99023]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 27 16:59:05 compute-0 sudo[99020]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:06 compute-0 python3.9[99173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:07 compute-0 python3.9[99294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211545.9778507-81-281150577729646/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:07 compute-0 python3.9[99444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:08 compute-0 python3.9[99565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211547.4757636-96-82927143150388/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:09 compute-0 sudo[99715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqkqtsmdcqvzavccyopottbnuzrfvkye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211548.9204938-113-237797936127571/AnsiballZ_setup.py'
Feb 27 16:59:09 compute-0 sudo[99715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:09 compute-0 python3.9[99718]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 16:59:09 compute-0 sudo[99715]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:10 compute-0 sudo[99800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xttphvthsoehgrmzunfgjifnxpeyuanl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211548.9204938-113-237797936127571/AnsiballZ_dnf.py'
Feb 27 16:59:10 compute-0 sudo[99800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:10 compute-0 python3.9[99803]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 16:59:11 compute-0 sudo[99800]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:12 compute-0 sudo[99954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlsejdjjrqzpqxipjevuwiptrobjcgky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211551.9745152-125-238770955355587/AnsiballZ_systemd.py'
Feb 27 16:59:12 compute-0 sudo[99954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:12 compute-0 python3.9[99957]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 16:59:13 compute-0 sudo[99954]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:13 compute-0 python3.9[100110]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:14 compute-0 python3.9[100231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211553.2370872-133-107853424527434/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:14 compute-0 python3.9[100381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:15 compute-0 python3.9[100502]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211554.5359192-133-136779430318217/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:16 compute-0 ovn_controller[96756]: 2026-02-27T16:59:16Z|00025|memory|INFO|16128 kB peak resident set size after 30.1 seconds
Feb 27 16:59:16 compute-0 ovn_controller[96756]: 2026-02-27T16:59:16Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Feb 27 16:59:16 compute-0 podman[100603]: 2026-02-27 16:59:16.693843759 +0000 UTC m=+0.097741157 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 27 16:59:16 compute-0 python3.9[100674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:17 compute-0 python3.9[100799]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211556.3910317-177-243901911014669/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:17 compute-0 python3.9[100949]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:18 compute-0 python3.9[101070]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211557.5243433-177-95313565516400/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:19 compute-0 python3.9[101220]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:59:19 compute-0 sudo[101372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osgnhzwkcwuntrazxzunzgcuaimzguqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211559.4584014-215-274255626494253/AnsiballZ_file.py'
Feb 27 16:59:19 compute-0 sudo[101372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:20 compute-0 python3.9[101375]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:20 compute-0 sudo[101372]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:20 compute-0 sudo[101525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfaaljzxdlzmrbwtaokdittctibcxngw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211560.238408-223-106560980132548/AnsiballZ_stat.py'
Feb 27 16:59:20 compute-0 sudo[101525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:20 compute-0 python3.9[101528]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:20 compute-0 sudo[101525]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:20 compute-0 sudo[101604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgivsefkauqkwljfwbzhqwasqybrytdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211560.238408-223-106560980132548/AnsiballZ_file.py'
Feb 27 16:59:20 compute-0 sudo[101604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:21 compute-0 python3.9[101607]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:21 compute-0 sudo[101604]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:21 compute-0 sudo[101757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwbdhayyczncuwnvueeqtgibkpavqdnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211561.30234-223-101956519135290/AnsiballZ_stat.py'
Feb 27 16:59:21 compute-0 sudo[101757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:21 compute-0 python3.9[101760]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:21 compute-0 sudo[101757]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:22 compute-0 sudo[101836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-recvffvowvbpwjnumgasfmagwcrdkrcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211561.30234-223-101956519135290/AnsiballZ_file.py'
Feb 27 16:59:22 compute-0 sudo[101836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:22 compute-0 python3.9[101839]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:22 compute-0 sudo[101836]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:22 compute-0 sudo[101989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzdtkgzncyeexbspsoqgkvmszrvuxarl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211562.5668766-246-17302582101631/AnsiballZ_file.py'
Feb 27 16:59:22 compute-0 sudo[101989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:22 compute-0 python3.9[101992]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:23 compute-0 sudo[101989]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:23 compute-0 sudo[102142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhvambrvcmijnzdbjdviettozwsifbqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211563.223742-254-236208985641518/AnsiballZ_stat.py'
Feb 27 16:59:23 compute-0 sudo[102142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:23 compute-0 python3.9[102145]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:23 compute-0 sudo[102142]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:23 compute-0 sudo[102221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrcfdpzlxxokiuwsarujrvfvldcvrrwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211563.223742-254-236208985641518/AnsiballZ_file.py'
Feb 27 16:59:23 compute-0 sudo[102221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:24 compute-0 python3.9[102224]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:24 compute-0 sudo[102221]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:24 compute-0 sudo[102374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfnfsujgpolfzssnldngfhktedcwtqww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211564.408479-266-235044983256104/AnsiballZ_stat.py'
Feb 27 16:59:24 compute-0 sudo[102374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:24 compute-0 python3.9[102377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:24 compute-0 sudo[102374]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:25 compute-0 sudo[102453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aprrsbheliydllsjpjgxocnerbxjbmrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211564.408479-266-235044983256104/AnsiballZ_file.py'
Feb 27 16:59:25 compute-0 sudo[102453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:25 compute-0 python3.9[102456]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:25 compute-0 sudo[102453]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:25 compute-0 sudo[102606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgtblenuulxlbzuudrxouysolzwpnvda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211565.5691724-278-267139078939147/AnsiballZ_systemd.py'
Feb 27 16:59:25 compute-0 sudo[102606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:26 compute-0 python3.9[102609]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:59:26 compute-0 systemd[1]: Reloading.
Feb 27 16:59:26 compute-0 systemd-rc-local-generator[102631]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:26 compute-0 systemd-sysv-generator[102638]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:26 compute-0 sudo[102606]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:26 compute-0 sudo[102803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osuukdxoqksjawxpymwmminkhtbepodq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211566.6245344-286-77377485240663/AnsiballZ_stat.py'
Feb 27 16:59:26 compute-0 sudo[102803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:27 compute-0 python3.9[102806]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:27 compute-0 sudo[102803]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:27 compute-0 sudo[102882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqxfwtalcislmtmccwedacgbguefnshi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211566.6245344-286-77377485240663/AnsiballZ_file.py'
Feb 27 16:59:27 compute-0 sudo[102882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:27 compute-0 python3.9[102885]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:27 compute-0 sudo[102882]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:28 compute-0 sudo[103035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdtgrghlvubyfogredrmtoykbwbjlmkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211567.840146-298-53154848648497/AnsiballZ_stat.py'
Feb 27 16:59:28 compute-0 sudo[103035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:28 compute-0 python3.9[103038]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:28 compute-0 sudo[103035]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:28 compute-0 sudo[103114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wawvfhrsioukyqwmgbntlcjunpdikcdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211567.840146-298-53154848648497/AnsiballZ_file.py'
Feb 27 16:59:28 compute-0 sudo[103114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:28 compute-0 python3.9[103117]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:28 compute-0 sudo[103114]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:29 compute-0 sudo[103267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rclocycncodworniaypvzginipaqfhlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211569.0282786-310-38165001081104/AnsiballZ_systemd.py'
Feb 27 16:59:29 compute-0 sudo[103267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:29 compute-0 python3.9[103270]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:59:29 compute-0 systemd[1]: Reloading.
Feb 27 16:59:29 compute-0 systemd-rc-local-generator[103294]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:29 compute-0 systemd-sysv-generator[103299]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:29 compute-0 systemd[1]: Starting Create netns directory...
Feb 27 16:59:29 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 27 16:59:29 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 27 16:59:29 compute-0 systemd[1]: Finished Create netns directory.
Feb 27 16:59:29 compute-0 sudo[103267]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:30 compute-0 sudo[103467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mycalslgdnprpnpkajlljrvlotarzmci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211570.3268979-320-143391857079428/AnsiballZ_file.py'
Feb 27 16:59:30 compute-0 sudo[103467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:30 compute-0 python3.9[103470]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:30 compute-0 sudo[103467]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:31 compute-0 sudo[103620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-najvhysmqzyfstwecxzexjnddastqxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211571.0400355-328-55787178749009/AnsiballZ_stat.py'
Feb 27 16:59:31 compute-0 sudo[103620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:31 compute-0 python3.9[103623]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:31 compute-0 sudo[103620]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:31 compute-0 sudo[103744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouekmgiolswbqfhhbosvgodygcocatju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211571.0400355-328-55787178749009/AnsiballZ_copy.py'
Feb 27 16:59:31 compute-0 sudo[103744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:32 compute-0 python3.9[103747]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211571.0400355-328-55787178749009/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:32 compute-0 sudo[103744]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:32 compute-0 sudo[103897]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxfxbdnizsbswhkfhckfikqsnpypbvne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211572.593173-345-116319304604072/AnsiballZ_file.py'
Feb 27 16:59:32 compute-0 sudo[103897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:33 compute-0 python3.9[103900]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:33 compute-0 sudo[103897]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:33 compute-0 sudo[104050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvhtzlivdbenlpemlvgmzfaagurfovva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211573.313486-353-147181562621528/AnsiballZ_file.py'
Feb 27 16:59:33 compute-0 sudo[104050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:33 compute-0 python3.9[104053]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 16:59:33 compute-0 sudo[104050]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:34 compute-0 sudo[104203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpwpoffvdlxtgkqcrngkmgsfiupftlqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211573.8771753-361-264166732538856/AnsiballZ_stat.py'
Feb 27 16:59:34 compute-0 sudo[104203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:34 compute-0 python3.9[104206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:34 compute-0 sudo[104203]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:34 compute-0 sudo[104327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pniveinbvcschxjeafwsszlcgqmyykup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211573.8771753-361-264166732538856/AnsiballZ_copy.py'
Feb 27 16:59:34 compute-0 sudo[104327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:35 compute-0 python3.9[104330]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211573.8771753-361-264166732538856/.source.json _original_basename=.0t29v3ui follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:35 compute-0 sudo[104327]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:35 compute-0 python3.9[104480]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:37 compute-0 sudo[104901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzzzrurssznomqgeaunchdwwdaecrxna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211577.3973572-401-257258084312219/AnsiballZ_container_config_data.py'
Feb 27 16:59:37 compute-0 sudo[104901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:38 compute-0 python3.9[104904]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 27 16:59:38 compute-0 sudo[104901]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:38 compute-0 sudo[105054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfwwecxcdhmfhaqbeqmtiuvivrfjlliu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211578.3904002-412-251718382808195/AnsiballZ_container_config_hash.py'
Feb 27 16:59:38 compute-0 sudo[105054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:39 compute-0 python3.9[105057]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 16:59:39 compute-0 sudo[105054]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:39 compute-0 sudo[105207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgnjpmscqikyqwwbfglghvpytvdvoztk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211579.3835263-422-265040840510035/AnsiballZ_edpm_container_manage.py'
Feb 27 16:59:39 compute-0 sudo[105207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:40 compute-0 python3[105210]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 16:59:40 compute-0 podman[105246]: 2026-02-27 16:59:40.327750865 +0000 UTC m=+0.064615540 container create adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 27 16:59:40 compute-0 podman[105246]: 2026-02-27 16:59:40.295559768 +0000 UTC m=+0.032424443 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 16:59:40 compute-0 python3[105210]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 16:59:40 compute-0 sudo[105207]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:40 compute-0 sudo[105434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfsxdctykzghnoufluccjhgkvbnmdheu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211580.6934607-430-82992299957515/AnsiballZ_stat.py'
Feb 27 16:59:41 compute-0 sudo[105434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:41 compute-0 python3.9[105437]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:59:41 compute-0 sudo[105434]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:42 compute-0 sudo[105589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltkkhxefkobhkbzvlryvihsciehpkovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211581.6976862-439-33578300521099/AnsiballZ_file.py'
Feb 27 16:59:42 compute-0 sudo[105589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:42 compute-0 python3.9[105592]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:42 compute-0 sudo[105589]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:42 compute-0 sudo[105666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvczcqtnhimfnxdbmdqacnwbnccqnsml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211581.6976862-439-33578300521099/AnsiballZ_stat.py'
Feb 27 16:59:42 compute-0 sudo[105666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:42 compute-0 python3.9[105669]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 16:59:42 compute-0 sudo[105666]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:43 compute-0 sudo[105818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vnwnnhxarbrovzkohxakmjbluhexqnnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211582.7264028-439-51452261712198/AnsiballZ_copy.py'
Feb 27 16:59:43 compute-0 sudo[105818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:43 compute-0 python3.9[105821]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772211582.7264028-439-51452261712198/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:43 compute-0 sudo[105818]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:43 compute-0 sudo[105895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svpaagkxezxyijzexzhqhoruxrzkgudi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211582.7264028-439-51452261712198/AnsiballZ_systemd.py'
Feb 27 16:59:43 compute-0 sudo[105895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:44 compute-0 python3.9[105898]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 16:59:44 compute-0 systemd[1]: Reloading.
Feb 27 16:59:44 compute-0 systemd-rc-local-generator[105917]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:44 compute-0 systemd-sysv-generator[105925]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:44 compute-0 sudo[105895]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:44 compute-0 sudo[106013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acbmkzqzwtipvufrvoifkkjwpwsamsyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211582.7264028-439-51452261712198/AnsiballZ_systemd.py'
Feb 27 16:59:44 compute-0 sudo[106013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:44 compute-0 python3.9[106016]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 16:59:45 compute-0 systemd[1]: Reloading.
Feb 27 16:59:45 compute-0 systemd-sysv-generator[106044]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:45 compute-0 systemd-rc-local-generator[106041]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:45 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Feb 27 16:59:45 compute-0 systemd[1]: Started libcrun container.
Feb 27 16:59:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a48d7dc69822637ef331c607083dde7e627926c69701cf46aeca6973eda3897c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 27 16:59:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a48d7dc69822637ef331c607083dde7e627926c69701cf46aeca6973eda3897c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 16:59:45 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366.
Feb 27 16:59:45 compute-0 podman[106064]: 2026-02-27 16:59:45.360177516 +0000 UTC m=+0.143206796 container init adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260223, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + sudo -E kolla_set_configs
Feb 27 16:59:45 compute-0 podman[106064]: 2026-02-27 16:59:45.386907098 +0000 UTC m=+0.169936368 container start adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, io.buildah.version=1.43.0)
Feb 27 16:59:45 compute-0 edpm-start-podman-container[106064]: ovn_metadata_agent
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Validating config file
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Copying service configuration files
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Writing out command to execute
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: ++ cat /run_command
Feb 27 16:59:45 compute-0 edpm-start-podman-container[106063]: Creating additional drop-in dependency for "ovn_metadata_agent" (adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366)
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + CMD=neutron-ovn-metadata-agent
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + ARGS=
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + sudo kolla_copy_cacerts
Feb 27 16:59:45 compute-0 systemd[1]: Reloading.
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + [[ ! -n '' ]]
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + . kolla_extend_start
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: Running command: 'neutron-ovn-metadata-agent'
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + umask 0022
Feb 27 16:59:45 compute-0 ovn_metadata_agent[106080]: + exec neutron-ovn-metadata-agent
Feb 27 16:59:45 compute-0 podman[106087]: 2026-02-27 16:59:45.5049524 +0000 UTC m=+0.104221811 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 16:59:45 compute-0 systemd-rc-local-generator[106156]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:45 compute-0 systemd-sysv-generator[106160]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:45 compute-0 systemd[1]: Started ovn_metadata_agent container.
Feb 27 16:59:45 compute-0 sudo[106013]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:46 compute-0 python3.9[106327]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.029 106085 INFO neutron.common.config [-] Logging enabled!
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.029 106085 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.029 106085 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.030 106085 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.031 106085 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.032 106085 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.033 106085 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.034 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.035 106085 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.036 106085 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.037 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.038 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.039 106085 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.040 106085 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.041 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.042 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.043 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.044 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.045 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.046 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.047 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.048 106085 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.049 106085 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.050 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.051 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.052 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.053 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.054 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.055 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.056 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.057 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.058 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.059 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.060 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.061 106085 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.070 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.070 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.070 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.070 106085 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.070 106085 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.082 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 114486db-e8a8-4651-8c2f-bcfde6c6e156 (UUID: 114486db-e8a8-4651-8c2f-bcfde6c6e156) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.108 106085 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.108 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.108 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.108 106085 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.112 106085 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.118 106085 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.122 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '114486db-e8a8-4651-8c2f-bcfde6c6e156'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], external_ids={}, name=114486db-e8a8-4651-8c2f-bcfde6c6e156, nb_cfg_timestamp=1772211534645, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.123 106085 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7efc24297b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.124 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.124 106085 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.124 106085 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.124 106085 INFO oslo_service.service [-] Starting 1 workers
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.127 106085 DEBUG oslo_service.service [-] Started child 106352 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.129 106085 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpu8wxxgkq/privsep.sock']
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.131 106352 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-499518'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.163 106352 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.164 106352 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.164 106352 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.169 106352 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.177 106352 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.186 106352 INFO eventlet.wsgi.server [-] (106352) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 27 16:59:47 compute-0 sudo[106494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqnhexyywmptipvwsotlmsreknftxccg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211587.236982-484-75406401970096/AnsiballZ_stat.py'
Feb 27 16:59:47 compute-0 sudo[106494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:47 compute-0 podman[106456]: 2026-02-27 16:59:47.654013246 +0000 UTC m=+0.140034487 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 27 16:59:47 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.747 106085 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.748 106085 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpu8wxxgkq/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.658 106512 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.662 106512 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.664 106512 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.665 106512 INFO oslo.privsep.daemon [-] privsep daemon running as pid 106512
Feb 27 16:59:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:47.751 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[87cacabe-fa62-42a6-a140-1bfb5e927d03]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 16:59:47 compute-0 python3.9[106502]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 16:59:47 compute-0 sudo[106494]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:48 compute-0 sudo[106639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpwdkrxvaqxejxbdyvuforjjizwtghhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211587.236982-484-75406401970096/AnsiballZ_copy.py'
Feb 27 16:59:48 compute-0 sudo[106639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.225 106512 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.225 106512 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.226 106512 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.680 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[8520379a-bf93-4385-88ef-b73f228f0509]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.684 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, column=external_ids, values=({'neutron:ovn-metadata-id': 'd23ed8b8-dd94-5021-b744-a08f9ea05f8b'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 16:59:48 compute-0 python3.9[106642]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211587.236982-484-75406401970096/.source.yaml _original_basename=._x63chm_ follow=False checksum=8c7428421f9c6988e32e62535456a60f572eebb6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 16:59:48 compute-0 sudo[106639]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.723 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.730 106085 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.731 106085 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.732 106085 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.733 106085 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.734 106085 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.735 106085 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.736 106085 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.737 106085 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.738 106085 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.739 106085 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.740 106085 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.741 106085 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.742 106085 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.743 106085 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.744 106085 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.745 106085 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.746 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.747 106085 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.748 106085 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.749 106085 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.750 106085 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.751 106085 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.752 106085 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.753 106085 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.754 106085 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.755 106085 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.756 106085 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.757 106085 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.758 106085 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.759 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.760 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.761 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 16:59:48 compute-0 ovn_metadata_agent[106080]: 2026-02-27 16:59:48.762 106085 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 16:59:49 compute-0 sshd-session[97800]: Connection closed by 192.168.122.30 port 56674
Feb 27 16:59:49 compute-0 sshd-session[97797]: pam_unix(sshd:session): session closed for user zuul
Feb 27 16:59:49 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Feb 27 16:59:49 compute-0 systemd[1]: session-21.scope: Consumed 33.707s CPU time.
Feb 27 16:59:49 compute-0 systemd-logind[803]: Session 21 logged out. Waiting for processes to exit.
Feb 27 16:59:49 compute-0 systemd-logind[803]: Removed session 21.
Feb 27 16:59:54 compute-0 sshd-session[106667]: Accepted publickey for zuul from 192.168.122.30 port 58468 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 16:59:54 compute-0 systemd-logind[803]: New session 22 of user zuul.
Feb 27 16:59:54 compute-0 systemd[1]: Started Session 22 of User zuul.
Feb 27 16:59:54 compute-0 sshd-session[106667]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 16:59:55 compute-0 python3.9[106820]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 16:59:56 compute-0 sudo[106974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wklmpyevvqclhameaclqwzyphvccfzxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211596.2454069-29-111460017751350/AnsiballZ_command.py'
Feb 27 16:59:56 compute-0 sudo[106974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:56 compute-0 python3.9[106977]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 16:59:57 compute-0 sudo[106974]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:57 compute-0 sudo[107140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htxmjgekbhopzqepvpebohmuwysncfat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211597.3824635-40-235823832998714/AnsiballZ_systemd_service.py'
Feb 27 16:59:57 compute-0 sudo[107140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 16:59:58 compute-0 python3.9[107143]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 16:59:58 compute-0 systemd[1]: Reloading.
Feb 27 16:59:58 compute-0 systemd-rc-local-generator[107173]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 16:59:58 compute-0 systemd-sysv-generator[107176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 16:59:58 compute-0 sudo[107140]: pam_unix(sudo:session): session closed for user root
Feb 27 16:59:59 compute-0 python3.9[107336]: ansible-ansible.builtin.service_facts Invoked
Feb 27 16:59:59 compute-0 network[107353]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 16:59:59 compute-0 network[107354]: 'network-scripts' will be removed from distribution in near future.
Feb 27 16:59:59 compute-0 network[107355]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 17:00:01 compute-0 sudo[107615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgttxrmynwbnhblcbbdornzobbxrivhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211601.6761575-59-71324036094448/AnsiballZ_systemd_service.py'
Feb 27 17:00:01 compute-0 sudo[107615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:02 compute-0 python3.9[107618]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:02 compute-0 sudo[107615]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:02 compute-0 sudo[107769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdccaszmergesmmjaekeipjhpsniaaty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211602.4116802-59-156727382477643/AnsiballZ_systemd_service.py'
Feb 27 17:00:02 compute-0 sudo[107769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:02 compute-0 python3.9[107772]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:03 compute-0 sudo[107769]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:03 compute-0 sudo[107923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dsbwzyczkdzmsgkbhfseecaeukenfogi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211603.1500702-59-224977572222754/AnsiballZ_systemd_service.py'
Feb 27 17:00:03 compute-0 sudo[107923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:03 compute-0 python3.9[107926]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:03 compute-0 sudo[107923]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:04 compute-0 sudo[108077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmvachiyjrttuggwngtvjadxlbrwefns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211603.9561462-59-207386550195761/AnsiballZ_systemd_service.py'
Feb 27 17:00:04 compute-0 sudo[108077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:05 compute-0 python3.9[108080]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:05 compute-0 sudo[108077]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:05 compute-0 sudo[108231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfwjhleynjgtcfosixbilckqpgzztloy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211605.3234613-59-202784590145431/AnsiballZ_systemd_service.py'
Feb 27 17:00:05 compute-0 sudo[108231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:06 compute-0 python3.9[108234]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:06 compute-0 sudo[108231]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:06 compute-0 sudo[108385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqgubifctdfjwosgdabhpnsxjswtixte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211606.230508-59-204529923436698/AnsiballZ_systemd_service.py'
Feb 27 17:00:06 compute-0 sudo[108385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:06 compute-0 python3.9[108388]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:06 compute-0 sudo[108385]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:07 compute-0 sudo[108539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gahnyzfnozsavfvmqavtbmsywrcllhpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211607.045774-59-24992901806697/AnsiballZ_systemd_service.py'
Feb 27 17:00:07 compute-0 sudo[108539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:07 compute-0 python3.9[108542]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:00:07 compute-0 sudo[108539]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:08 compute-0 sudo[108693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zogjjjbedujgfeslvonjrqhsasrbjfrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211608.0132039-111-103918306578478/AnsiballZ_file.py'
Feb 27 17:00:08 compute-0 sudo[108693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:08 compute-0 python3.9[108696]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:08 compute-0 sudo[108693]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:09 compute-0 sudo[108846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whlfygtpgnfkwwidjlwtqqjbthjqtqox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211608.8830342-111-269927832910897/AnsiballZ_file.py'
Feb 27 17:00:09 compute-0 sudo[108846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:09 compute-0 python3.9[108849]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:09 compute-0 sudo[108846]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:09 compute-0 sudo[108999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtfcskpupbbjlxyuoxlosbzzobguidlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211609.4928045-111-198433090856556/AnsiballZ_file.py'
Feb 27 17:00:09 compute-0 sudo[108999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:09 compute-0 python3.9[109002]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:10 compute-0 sudo[108999]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:10 compute-0 sudo[109152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpkkvqprwioyymlbkdmtegzdvshnnwly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211610.162935-111-252193955611189/AnsiballZ_file.py'
Feb 27 17:00:10 compute-0 sudo[109152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:10 compute-0 python3.9[109155]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:10 compute-0 sudo[109152]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:11 compute-0 sudo[109305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zljwvxkjupuluzztpxnhcdocllwtfezt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211610.9128346-111-5342742511117/AnsiballZ_file.py'
Feb 27 17:00:11 compute-0 sudo[109305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:11 compute-0 python3.9[109308]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:11 compute-0 sudo[109305]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:11 compute-0 sudo[109458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkgrheryrnopzujgksojiaoiejvebjza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211611.585339-111-38034472905214/AnsiballZ_file.py'
Feb 27 17:00:11 compute-0 sudo[109458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:12 compute-0 python3.9[109461]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:12 compute-0 sudo[109458]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:12 compute-0 sudo[109611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmdblkdiilesenxldvgdstbzzpggmmkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211612.2823925-111-191085009708096/AnsiballZ_file.py'
Feb 27 17:00:12 compute-0 sudo[109611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:12 compute-0 python3.9[109614]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:12 compute-0 sudo[109611]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:13 compute-0 sudo[109764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbpadehjjuaaiwmspbdtlvrghjmyvrcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211612.9028728-161-108934275215235/AnsiballZ_file.py'
Feb 27 17:00:13 compute-0 sudo[109764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:13 compute-0 python3.9[109767]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:13 compute-0 sudo[109764]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:13 compute-0 sudo[109917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxddszhirgkczevblirboifsucedvzfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211613.5709524-161-44448686382937/AnsiballZ_file.py'
Feb 27 17:00:13 compute-0 sudo[109917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:13 compute-0 python3.9[109920]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:13 compute-0 sudo[109917]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:14 compute-0 sudo[110070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sabeysgiktcjcihbbgxigweuceitrudm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211614.1220737-161-116619333779454/AnsiballZ_file.py'
Feb 27 17:00:14 compute-0 sudo[110070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:14 compute-0 python3.9[110073]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:14 compute-0 sudo[110070]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:14 compute-0 sudo[110223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdxerbhtfnedcohdbkohlhetwpnyutvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211614.731083-161-237247171606626/AnsiballZ_file.py'
Feb 27 17:00:14 compute-0 sudo[110223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:15 compute-0 python3.9[110226]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:15 compute-0 sudo[110223]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:15 compute-0 sudo[110376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjvgslyczjynimutaylniwugkasusvjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211615.2746112-161-268309259565544/AnsiballZ_file.py'
Feb 27 17:00:15 compute-0 sudo[110376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:15 compute-0 python3.9[110379]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:15 compute-0 sudo[110376]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:15 compute-0 podman[110380]: 2026-02-27 17:00:15.813226125 +0000 UTC m=+0.055387022 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:00:16 compute-0 sudo[110549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiepvetazclaezigzfllsiyobtcjehlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211615.8587508-161-143042607625785/AnsiballZ_file.py'
Feb 27 17:00:16 compute-0 sudo[110549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:16 compute-0 python3.9[110552]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:16 compute-0 sudo[110549]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:16 compute-0 sudo[110702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjbffnncruyhfeylsgikyrijbrgfgspc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211616.4784214-161-86540963646714/AnsiballZ_file.py'
Feb 27 17:00:16 compute-0 sudo[110702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:17 compute-0 python3.9[110705]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:00:17 compute-0 sudo[110702]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:17 compute-0 sudo[110855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxqpgujdknkmvsahppolnmdttwenayub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211617.2639914-212-187570045326295/AnsiballZ_command.py'
Feb 27 17:00:17 compute-0 sudo[110855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:17 compute-0 python3.9[110858]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:17 compute-0 sudo[110855]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:17 compute-0 podman[110861]: 2026-02-27 17:00:17.931015098 +0000 UTC m=+0.089813784 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:00:18 compute-0 python3.9[111036]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 17:00:19 compute-0 sudo[111186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aokakfbmlvodviplvqarabtzodrwgvue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211618.9123843-230-240898447412982/AnsiballZ_systemd_service.py'
Feb 27 17:00:19 compute-0 sudo[111186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:19 compute-0 python3.9[111189]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:00:19 compute-0 systemd[1]: Reloading.
Feb 27 17:00:19 compute-0 systemd-rc-local-generator[111209]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:00:19 compute-0 systemd-sysv-generator[111212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:00:19 compute-0 sudo[111186]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:20 compute-0 sudo[111380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otvdgocdmcmxnwqkexkqazbwzqqtiovw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211619.9159546-238-187103270064738/AnsiballZ_command.py'
Feb 27 17:00:20 compute-0 sudo[111380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:20 compute-0 python3.9[111383]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:20 compute-0 sudo[111380]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:21 compute-0 sudo[111534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnllczkbioomchfpdwxvbmvtnbjioayp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211620.696667-238-237662947807710/AnsiballZ_command.py'
Feb 27 17:00:21 compute-0 sudo[111534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:21 compute-0 python3.9[111537]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:21 compute-0 sudo[111534]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:22 compute-0 sudo[111688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxdwafkzjevatuysopsamondtpyyqvnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211621.7994716-238-8746205448425/AnsiballZ_command.py'
Feb 27 17:00:22 compute-0 sudo[111688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:22 compute-0 python3.9[111691]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:22 compute-0 sudo[111688]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:22 compute-0 sudo[111842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygapafhbhmvykqchshjtrlfjtisdhvqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211622.5560837-238-99010784019769/AnsiballZ_command.py'
Feb 27 17:00:22 compute-0 sudo[111842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:23 compute-0 python3.9[111845]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:23 compute-0 sudo[111842]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:23 compute-0 sudo[111996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brfqufqdfoobtjjwfenhtnqdyhjvjint ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211623.5887709-238-134099816679548/AnsiballZ_command.py'
Feb 27 17:00:23 compute-0 sudo[111996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:24 compute-0 python3.9[111999]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:24 compute-0 sudo[111996]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:24 compute-0 sudo[112150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yxvfrvuxyomyoopvzczdebfromhyhxsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211624.3876212-238-173017720061008/AnsiballZ_command.py'
Feb 27 17:00:24 compute-0 sudo[112150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:24 compute-0 python3.9[112153]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:24 compute-0 sudo[112150]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:25 compute-0 sudo[112304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nloybspjqrosfbujpphluypcmsqbfkfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211625.0105095-238-115869083032759/AnsiballZ_command.py'
Feb 27 17:00:25 compute-0 sudo[112304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:25 compute-0 python3.9[112307]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:00:25 compute-0 sudo[112304]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:26 compute-0 sudo[112458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmcmwbxetuctlfabryvaqdsboygwaehd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211626.015114-292-84456141641027/AnsiballZ_getent.py'
Feb 27 17:00:26 compute-0 sudo[112458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:26 compute-0 python3.9[112461]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 27 17:00:26 compute-0 sudo[112458]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:27 compute-0 sudo[112612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaksvslkbrcxrdgodwdrhsenthysckge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211626.7705376-300-201239586274911/AnsiballZ_group.py'
Feb 27 17:00:27 compute-0 sudo[112612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:27 compute-0 python3.9[112615]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 17:00:27 compute-0 groupadd[112616]: group added to /etc/group: name=libvirt, GID=42473
Feb 27 17:00:27 compute-0 groupadd[112616]: group added to /etc/gshadow: name=libvirt
Feb 27 17:00:27 compute-0 groupadd[112616]: new group: name=libvirt, GID=42473
Feb 27 17:00:27 compute-0 sudo[112612]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:28 compute-0 sudo[112771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmqzeyqhikejoesilaslmnlyojgelhdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211628.0458963-308-187035949712729/AnsiballZ_user.py'
Feb 27 17:00:28 compute-0 sudo[112771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:28 compute-0 python3.9[112774]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 27 17:00:29 compute-0 useradd[112776]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 27 17:00:29 compute-0 sudo[112771]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:30 compute-0 sudo[112932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llqhcrarztrbvhzvmvgydqmsgcesxdij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211629.7792335-319-94070375625746/AnsiballZ_setup.py'
Feb 27 17:00:30 compute-0 sudo[112932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:30 compute-0 python3.9[112935]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 17:00:30 compute-0 sudo[112932]: pam_unix(sudo:session): session closed for user root
Feb 27 17:00:30 compute-0 sudo[113017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzyxhxmerfudgrhgpbgltlkhhpyzhkgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211629.7792335-319-94070375625746/AnsiballZ_dnf.py'
Feb 27 17:00:30 compute-0 sudo[113017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:00:31 compute-0 python3.9[113020]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 17:00:46 compute-0 podman[113205]: 2026-02-27 17:00:46.658283169 +0000 UTC m=+0.062316485 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 27 17:00:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:00:47.071 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:00:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:00:47.072 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:00:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:00:47.072 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:00:48 compute-0 podman[113230]: 2026-02-27 17:00:48.938971619 +0000 UTC m=+0.095434206 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:00:56 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 17:00:56 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 17:01:01 compute-0 CROND[113267]: (root) CMD (run-parts /etc/cron.hourly)
Feb 27 17:01:01 compute-0 run-parts[113270]: (/etc/cron.hourly) starting 0anacron
Feb 27 17:01:01 compute-0 anacron[113278]: Anacron started on 2026-02-27
Feb 27 17:01:01 compute-0 run-parts[113280]: (/etc/cron.hourly) finished 0anacron
Feb 27 17:01:01 compute-0 anacron[113278]: Will run job `cron.daily' in 49 min.
Feb 27 17:01:01 compute-0 CROND[113266]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 27 17:01:01 compute-0 anacron[113278]: Will run job `cron.weekly' in 69 min.
Feb 27 17:01:01 compute-0 anacron[113278]: Will run job `cron.monthly' in 89 min.
Feb 27 17:01:01 compute-0 anacron[113278]: Jobs will be executed sequentially
Feb 27 17:01:06 compute-0 kernel: SELinux:  Converting 2767 SID table entries...
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 17:01:06 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 17:01:17 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 27 17:01:17 compute-0 podman[113289]: 2026-02-27 17:01:17.690573866 +0000 UTC m=+0.073049634 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 27 17:01:19 compute-0 podman[114807]: 2026-02-27 17:01:19.675216829 +0000 UTC m=+0.081289590 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 27 17:01:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:01:47.072 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:01:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:01:47.072 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:01:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:01:47.073 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:01:48 compute-0 podman[130234]: 2026-02-27 17:01:48.676889468 +0000 UTC m=+0.077372436 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:01:50 compute-0 podman[130253]: 2026-02-27 17:01:50.682235281 +0000 UTC m=+0.093911428 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 17:01:57 compute-0 kernel: SELinux:  Converting 2768 SID table entries...
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability open_perms=1
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability always_check_network=0
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 27 17:01:57 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 27 17:02:04 compute-0 groupadd[130292]: group added to /etc/group: name=dnsmasq, GID=993
Feb 27 17:02:04 compute-0 groupadd[130292]: group added to /etc/gshadow: name=dnsmasq
Feb 27 17:02:04 compute-0 groupadd[130292]: new group: name=dnsmasq, GID=993
Feb 27 17:02:05 compute-0 useradd[130299]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 27 17:02:05 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 17:02:05 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 27 17:02:05 compute-0 dbus-broker-launch[779]: Noticed file-system modification, trigger reload.
Feb 27 17:02:08 compute-0 groupadd[130312]: group added to /etc/group: name=clevis, GID=992
Feb 27 17:02:08 compute-0 groupadd[130312]: group added to /etc/gshadow: name=clevis
Feb 27 17:02:08 compute-0 groupadd[130312]: new group: name=clevis, GID=992
Feb 27 17:02:08 compute-0 useradd[130319]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 27 17:02:08 compute-0 usermod[130329]: add 'clevis' to group 'tss'
Feb 27 17:02:08 compute-0 usermod[130329]: add 'clevis' to shadow group 'tss'
Feb 27 17:02:11 compute-0 polkitd[44568]: Reloading rules
Feb 27 17:02:11 compute-0 polkitd[44568]: Collecting garbage unconditionally...
Feb 27 17:02:11 compute-0 polkitd[44568]: Loading rules from directory /etc/polkit-1/rules.d
Feb 27 17:02:11 compute-0 polkitd[44568]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 27 17:02:11 compute-0 polkitd[44568]: Finished loading, compiling and executing 3 rules
Feb 27 17:02:11 compute-0 polkitd[44568]: Reloading rules
Feb 27 17:02:11 compute-0 polkitd[44568]: Collecting garbage unconditionally...
Feb 27 17:02:11 compute-0 polkitd[44568]: Loading rules from directory /etc/polkit-1/rules.d
Feb 27 17:02:11 compute-0 polkitd[44568]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 27 17:02:11 compute-0 polkitd[44568]: Finished loading, compiling and executing 3 rules
Feb 27 17:02:12 compute-0 groupadd[130519]: group added to /etc/group: name=ceph, GID=167
Feb 27 17:02:12 compute-0 groupadd[130519]: group added to /etc/gshadow: name=ceph
Feb 27 17:02:12 compute-0 groupadd[130519]: new group: name=ceph, GID=167
Feb 27 17:02:12 compute-0 useradd[130525]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 27 17:02:15 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Feb 27 17:02:15 compute-0 sshd[1013]: Received signal 15; terminating.
Feb 27 17:02:15 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Feb 27 17:02:15 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Feb 27 17:02:15 compute-0 systemd[1]: sshd.service: Consumed 4.036s CPU time, read 32.0K from disk, written 48.0K to disk.
Feb 27 17:02:15 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Feb 27 17:02:15 compute-0 systemd[1]: Stopping sshd-keygen.target...
Feb 27 17:02:15 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 17:02:15 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 17:02:15 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 27 17:02:15 compute-0 systemd[1]: Reached target sshd-keygen.target.
Feb 27 17:02:15 compute-0 systemd[1]: Starting OpenSSH server daemon...
Feb 27 17:02:15 compute-0 sshd[131044]: Server listening on 0.0.0.0 port 22.
Feb 27 17:02:15 compute-0 sshd[131044]: Server listening on :: port 22.
Feb 27 17:02:15 compute-0 systemd[1]: Started OpenSSH server daemon.
Feb 27 17:02:17 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 17:02:17 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 17:02:17 compute-0 systemd[1]: Reloading.
Feb 27 17:02:17 compute-0 systemd-sysv-generator[131303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:17 compute-0 systemd-rc-local-generator[131295]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:17 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 17:02:19 compute-0 podman[131321]: 2026-02-27 17:02:19.680987692 +0000 UTC m=+0.080427470 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 27 17:02:21 compute-0 podman[131340]: 2026-02-27 17:02:21.684955662 +0000 UTC m=+0.086182611 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:02:22 compute-0 sudo[113017]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:23 compute-0 sudo[134553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzvvphhwhmwnwxgxaceybfmelgmszqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211743.1201847-331-87613186571249/AnsiballZ_systemd.py'
Feb 27 17:02:23 compute-0 sudo[134553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:23 compute-0 python3.9[134579]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 17:02:23 compute-0 systemd[1]: Reloading.
Feb 27 17:02:24 compute-0 systemd-rc-local-generator[135314]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:24 compute-0 systemd-sysv-generator[135322]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:24 compute-0 sudo[134553]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:24 compute-0 sudo[136738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvnkietulvheqlxqojcagsietmvwqaxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211744.3710992-331-194025103582869/AnsiballZ_systemd.py'
Feb 27 17:02:24 compute-0 sudo[136738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:24 compute-0 python3.9[136778]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 17:02:24 compute-0 systemd[1]: Reloading.
Feb 27 17:02:25 compute-0 systemd-rc-local-generator[137634]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:25 compute-0 systemd-sysv-generator[137640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:25 compute-0 sudo[136738]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:25 compute-0 sudo[138737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpxyqtpoetzziyblxarqyzfocktysjlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211745.330702-331-59767261038869/AnsiballZ_systemd.py'
Feb 27 17:02:25 compute-0 sudo[138737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:25 compute-0 python3.9[138766]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 17:02:25 compute-0 systemd[1]: Reloading.
Feb 27 17:02:25 compute-0 systemd-rc-local-generator[139487]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:25 compute-0 systemd-sysv-generator[139490]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:26 compute-0 sudo[138737]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:26 compute-0 sudo[140435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twtfpcmfphwbkavcqpsmmukgiomytltk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211746.2661798-331-42656727186789/AnsiballZ_systemd.py'
Feb 27 17:02:26 compute-0 sudo[140435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:26 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 17:02:26 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 17:02:26 compute-0 systemd[1]: man-db-cache-update.service: Consumed 6.650s CPU time.
Feb 27 17:02:26 compute-0 systemd[1]: run-rbbc25dd10eb94320a182aae047dc4e45.service: Deactivated successfully.
Feb 27 17:02:26 compute-0 python3.9[140470]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 17:02:26 compute-0 systemd[1]: Reloading.
Feb 27 17:02:26 compute-0 systemd-sysv-generator[140536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:26 compute-0 systemd-rc-local-generator[140533]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:27 compute-0 sudo[140435]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:27 compute-0 sudo[140697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xahxqicthqpexodvxxheoqdublqribtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211747.3048074-360-27076213798412/AnsiballZ_systemd.py'
Feb 27 17:02:27 compute-0 sudo[140697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:27 compute-0 python3.9[140700]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:27 compute-0 systemd[1]: Reloading.
Feb 27 17:02:28 compute-0 systemd-rc-local-generator[140733]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:28 compute-0 systemd-sysv-generator[140736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:28 compute-0 sudo[140697]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:28 compute-0 sudo[140896]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opmphxbqqkxxcpyaoonwtoudnoemvhml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211748.330856-360-169965514368517/AnsiballZ_systemd.py'
Feb 27 17:02:28 compute-0 sudo[140896]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:28 compute-0 python3.9[140899]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:28 compute-0 systemd[1]: Reloading.
Feb 27 17:02:29 compute-0 systemd-sysv-generator[140939]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:29 compute-0 systemd-rc-local-generator[140935]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:29 compute-0 sudo[140896]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:29 compute-0 sudo[141094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omjycsacuejrbpkvgmctkjkvduyrkcci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211749.3158195-360-4922796943750/AnsiballZ_systemd.py'
Feb 27 17:02:29 compute-0 sudo[141094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:29 compute-0 python3.9[141097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:29 compute-0 systemd[1]: Reloading.
Feb 27 17:02:30 compute-0 systemd-sysv-generator[141138]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:30 compute-0 systemd-rc-local-generator[141134]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:30 compute-0 sudo[141094]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:30 compute-0 sudo[141291]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usmzdueyuexncprnpfolwwnaqlahsjuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211750.3895464-360-225287388939585/AnsiballZ_systemd.py'
Feb 27 17:02:30 compute-0 sudo[141291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:31 compute-0 python3.9[141294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:31 compute-0 sudo[141291]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:31 compute-0 sudo[141447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxhceitdthkznnlnpnwklshyhofjfebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211751.2434776-360-182065747911966/AnsiballZ_systemd.py'
Feb 27 17:02:31 compute-0 sudo[141447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:32 compute-0 python3.9[141450]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:32 compute-0 systemd[1]: Reloading.
Feb 27 17:02:32 compute-0 systemd-rc-local-generator[141479]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:32 compute-0 systemd-sysv-generator[141485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:32 compute-0 sudo[141447]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:32 compute-0 sudo[141645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfxfdkqnhvryycbbmovcamnsnbgrctrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211752.5973818-396-11005648330081/AnsiballZ_systemd.py'
Feb 27 17:02:32 compute-0 sudo[141645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:33 compute-0 python3.9[141648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 27 17:02:33 compute-0 systemd[1]: Reloading.
Feb 27 17:02:33 compute-0 systemd-sysv-generator[141677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:02:33 compute-0 systemd-rc-local-generator[141673]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:02:33 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Feb 27 17:02:33 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Feb 27 17:02:33 compute-0 sudo[141645]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:34 compute-0 sudo[141846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjqzrcblrstcmvnvdrtzsuvjvxdsfzot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211753.9481049-404-186949207801904/AnsiballZ_systemd.py'
Feb 27 17:02:34 compute-0 sudo[141846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:34 compute-0 python3.9[141849]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:34 compute-0 sudo[141846]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:35 compute-0 sudo[142002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djxxfjmtvkadxfyqzrktpwatjsiignbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211754.9545777-404-81315859380552/AnsiballZ_systemd.py'
Feb 27 17:02:35 compute-0 sudo[142002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:35 compute-0 python3.9[142005]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:35 compute-0 sudo[142002]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:36 compute-0 sudo[142158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vewnacsebmlxifagfjrntmvvqmmdbjoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211755.8460882-404-108701699010786/AnsiballZ_systemd.py'
Feb 27 17:02:36 compute-0 sudo[142158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:36 compute-0 python3.9[142161]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:36 compute-0 sudo[142158]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:36 compute-0 sudo[142314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipffjwuffnqhrgfogddskrkjqyzifywm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211756.6389532-404-28281177712376/AnsiballZ_systemd.py'
Feb 27 17:02:36 compute-0 sudo[142314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:37 compute-0 python3.9[142317]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:37 compute-0 sudo[142314]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:37 compute-0 sudo[142470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwtykfgxhizkxvlehjtvszfqajakogry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211757.3630955-404-90379482273017/AnsiballZ_systemd.py'
Feb 27 17:02:37 compute-0 sudo[142470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:37 compute-0 python3.9[142473]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:38 compute-0 sudo[142470]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:39 compute-0 sudo[142626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpyxgiclonbyukbzkkxpzoxkmwtlztii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211759.1066508-404-254614712605728/AnsiballZ_systemd.py'
Feb 27 17:02:39 compute-0 sudo[142626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:39 compute-0 python3.9[142629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:39 compute-0 sudo[142626]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:40 compute-0 sudo[142782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yppksjqbsmfrpekzaudtnwetaemljlom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211759.872554-404-226406034016891/AnsiballZ_systemd.py'
Feb 27 17:02:40 compute-0 sudo[142782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:40 compute-0 python3.9[142785]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:40 compute-0 sudo[142782]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:40 compute-0 sudo[142938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfasvqbbotxupswmpjhpfnwcnscnkhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211760.6547642-404-101012320425474/AnsiballZ_systemd.py'
Feb 27 17:02:40 compute-0 sudo[142938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:41 compute-0 python3.9[142941]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:42 compute-0 sudo[142938]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:42 compute-0 sudo[143094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tabtsdyizmguqjwunjtskaasfnpncqsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211762.514562-404-132462341408245/AnsiballZ_systemd.py'
Feb 27 17:02:42 compute-0 sudo[143094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:43 compute-0 python3.9[143097]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:43 compute-0 sudo[143094]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:43 compute-0 sudo[143250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjiapybgwfkuxxgzjgrkfeaklqnnvvsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211763.321725-404-165786662520996/AnsiballZ_systemd.py'
Feb 27 17:02:43 compute-0 sudo[143250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:43 compute-0 python3.9[143253]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:43 compute-0 sudo[143250]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:44 compute-0 sudo[143406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifinfhhphfyyimcsnvmhkptkezqgpxnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211764.0398662-404-89896553193877/AnsiballZ_systemd.py'
Feb 27 17:02:44 compute-0 sudo[143406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:44 compute-0 python3.9[143409]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:44 compute-0 sudo[143406]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:45 compute-0 sudo[143562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rotsyczgmulcingvzqqmrjdddzvldgvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211764.7352383-404-124875766456950/AnsiballZ_systemd.py'
Feb 27 17:02:45 compute-0 sudo[143562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:45 compute-0 python3.9[143565]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:45 compute-0 sudo[143562]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:45 compute-0 sudo[143718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftdiiddfqcpkzqouvqkwrtgmfmkwahlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211765.46858-404-188172004568139/AnsiballZ_systemd.py'
Feb 27 17:02:45 compute-0 sudo[143718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:46 compute-0 python3.9[143721]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:46 compute-0 sudo[143718]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:46 compute-0 sudo[143874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwvjstvthqpioetdtlunwiarolpmyyvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211766.2349622-404-166882008400729/AnsiballZ_systemd.py'
Feb 27 17:02:46 compute-0 sudo[143874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:46 compute-0 python3.9[143877]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 27 17:02:46 compute-0 sudo[143874]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:02:47.073 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:02:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:02:47.074 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:02:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:02:47.074 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:02:47 compute-0 sudo[144030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqpwssyuedxtrnwzcailxuhkvrlplboh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211767.2178981-506-84033356302755/AnsiballZ_file.py'
Feb 27 17:02:47 compute-0 sudo[144030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:47 compute-0 python3.9[144033]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:47 compute-0 sudo[144030]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:48 compute-0 sudo[144183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lleudwtfuhpmdaqcavlzftwnmrfesrna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211767.8982866-506-14476868294232/AnsiballZ_file.py'
Feb 27 17:02:48 compute-0 sudo[144183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:48 compute-0 python3.9[144186]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:48 compute-0 sudo[144183]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:49 compute-0 sudo[144336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejazuazlxvmhwplhgrliqarmlcvrolqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211769.2409832-506-43966312985907/AnsiballZ_file.py'
Feb 27 17:02:49 compute-0 sudo[144336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:49 compute-0 python3.9[144339]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:49 compute-0 sudo[144336]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:49 compute-0 podman[144340]: 2026-02-27 17:02:49.869019855 +0000 UTC m=+0.070044867 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 27 17:02:50 compute-0 sudo[144506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssnxnkngfjpfraskrorkdhcyxnlkmyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211769.8845458-506-3073948535119/AnsiballZ_file.py'
Feb 27 17:02:50 compute-0 sudo[144506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:50 compute-0 python3.9[144509]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:50 compute-0 sudo[144506]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:50 compute-0 sudo[144659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojddievmcxwoaswiwwqulnocbhyrqfcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211770.5008028-506-182521574842229/AnsiballZ_file.py'
Feb 27 17:02:50 compute-0 sudo[144659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:50 compute-0 python3.9[144662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:50 compute-0 sudo[144659]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:51 compute-0 sudo[144812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bffsgbqxobtyimvlavnvazzofntuhspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211771.0566978-506-88567320487467/AnsiballZ_file.py'
Feb 27 17:02:51 compute-0 sudo[144812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:51 compute-0 python3.9[144815]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:02:51 compute-0 sudo[144812]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:52 compute-0 podman[144939]: 2026-02-27 17:02:52.149105753 +0000 UTC m=+0.096267688 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:02:52 compute-0 python3.9[144978]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 17:02:52 compute-0 sudo[145141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqkuftyusoaeelhybwbiljbqnvtkgdis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211772.5618799-557-11394662552810/AnsiballZ_stat.py'
Feb 27 17:02:52 compute-0 sudo[145141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:53 compute-0 python3.9[145144]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:53 compute-0 sudo[145141]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:53 compute-0 sudo[145267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tertnbeocahqzbeiqhiybcoufcbikvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211772.5618799-557-11394662552810/AnsiballZ_copy.py'
Feb 27 17:02:53 compute-0 sudo[145267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:53 compute-0 python3.9[145270]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211772.5618799-557-11394662552810/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:53 compute-0 sudo[145267]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:54 compute-0 sudo[145420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caxkqojmmnomzcllrhzggaqhtokcespg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211774.0869176-557-161512728105312/AnsiballZ_stat.py'
Feb 27 17:02:54 compute-0 sudo[145420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:54 compute-0 python3.9[145423]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:54 compute-0 sudo[145420]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:54 compute-0 sudo[145546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxbzodtezkqrshkfzivtydzwbwhlzgka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211774.0869176-557-161512728105312/AnsiballZ_copy.py'
Feb 27 17:02:54 compute-0 sudo[145546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:55 compute-0 python3.9[145549]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211774.0869176-557-161512728105312/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:55 compute-0 sudo[145546]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:55 compute-0 sudo[145699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nixyqgrzjgszkhhjlyyqmsdxsssgadsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211775.3013628-557-93238523898114/AnsiballZ_stat.py'
Feb 27 17:02:55 compute-0 sudo[145699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:55 compute-0 python3.9[145702]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:55 compute-0 sudo[145699]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:56 compute-0 sudo[145825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxljxuizvmcmaxelkbaqsfuqqcerkthz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211775.3013628-557-93238523898114/AnsiballZ_copy.py'
Feb 27 17:02:56 compute-0 sudo[145825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:56 compute-0 python3.9[145828]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211775.3013628-557-93238523898114/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:56 compute-0 sudo[145825]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:56 compute-0 sudo[145978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnsraydajluvxqsicxxazboklbyximnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211776.4579594-557-6515114320876/AnsiballZ_stat.py'
Feb 27 17:02:56 compute-0 sudo[145978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:56 compute-0 python3.9[145981]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:56 compute-0 sudo[145978]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:57 compute-0 sudo[146104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvtxpmsfvjjjvvisnixtwappcnqovhho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211776.4579594-557-6515114320876/AnsiballZ_copy.py'
Feb 27 17:02:57 compute-0 sudo[146104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:57 compute-0 python3.9[146107]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211776.4579594-557-6515114320876/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:57 compute-0 sudo[146104]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:57 compute-0 sudo[146257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dczpvlhpxwygplzkpknbwowgexdgttcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211777.560733-557-162326150908165/AnsiballZ_stat.py'
Feb 27 17:02:57 compute-0 sudo[146257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:57 compute-0 python3.9[146260]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:58 compute-0 sudo[146257]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:58 compute-0 sudo[146383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxgkntzhtrytbhrlzjlrtltdpaqaybec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211777.560733-557-162326150908165/AnsiballZ_copy.py'
Feb 27 17:02:58 compute-0 sudo[146383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:58 compute-0 python3.9[146386]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211777.560733-557-162326150908165/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:58 compute-0 sudo[146383]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:58 compute-0 sudo[146536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udgxtbxzfxtyusvrcloppbjfxyvgqvxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211778.633646-557-87704212314627/AnsiballZ_stat.py'
Feb 27 17:02:58 compute-0 sudo[146536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:59 compute-0 python3.9[146539]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:02:59 compute-0 sudo[146536]: pam_unix(sudo:session): session closed for user root
Feb 27 17:02:59 compute-0 sudo[146662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqaeobahympxjzsilcgejmcvedrbaggr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211778.633646-557-87704212314627/AnsiballZ_copy.py'
Feb 27 17:02:59 compute-0 sudo[146662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:02:59 compute-0 python3.9[146665]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211778.633646-557-87704212314627/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:02:59 compute-0 sudo[146662]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:00 compute-0 sudo[146815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uildbquxclmldmdkanttywyqsjypomnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211779.7787576-557-69220000976614/AnsiballZ_stat.py'
Feb 27 17:03:00 compute-0 sudo[146815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:00 compute-0 python3.9[146818]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:00 compute-0 sudo[146815]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:00 compute-0 sudo[146939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmsgbnaxqjvbfdiptdjybibzvwjehtoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211779.7787576-557-69220000976614/AnsiballZ_copy.py'
Feb 27 17:03:00 compute-0 sudo[146939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:00 compute-0 python3.9[146942]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211779.7787576-557-69220000976614/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:00 compute-0 sudo[146939]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:01 compute-0 sudo[147092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkbnuaqqeuoqfrydmefywkikwdtmmwhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211780.9324863-557-39147149666285/AnsiballZ_stat.py'
Feb 27 17:03:01 compute-0 sudo[147092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:01 compute-0 python3.9[147095]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:01 compute-0 sudo[147092]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:01 compute-0 sudo[147218]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utpgewbnveufbqyqdvdnqktxppustfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211780.9324863-557-39147149666285/AnsiballZ_copy.py'
Feb 27 17:03:01 compute-0 sudo[147218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:02 compute-0 python3.9[147221]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1772211780.9324863-557-39147149666285/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:02 compute-0 sudo[147218]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:02 compute-0 sudo[147371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrjiobfmvrvqnwfjcexpgqizoeyzxvng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211782.330472-670-256344758166332/AnsiballZ_command.py'
Feb 27 17:03:02 compute-0 sudo[147371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:02 compute-0 python3.9[147374]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Feb 27 17:03:02 compute-0 sudo[147371]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:03 compute-0 sudo[147525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruoeuqntpwjudcdglclvfyofidrmcmit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211783.106299-679-207386155535848/AnsiballZ_file.py'
Feb 27 17:03:03 compute-0 sudo[147525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:03 compute-0 python3.9[147528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:03 compute-0 sudo[147525]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:04 compute-0 sudo[147678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wavpqvhalpuppfvleobjkdudtvlqwxzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211783.7778876-679-13815390011484/AnsiballZ_file.py'
Feb 27 17:03:04 compute-0 sudo[147678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:04 compute-0 python3.9[147681]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:04 compute-0 sudo[147678]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:04 compute-0 sudo[147831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhxduiooyumusdtxmjeodhexijfibcfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211784.4122136-679-139484271622409/AnsiballZ_file.py'
Feb 27 17:03:04 compute-0 sudo[147831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:04 compute-0 python3.9[147834]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:04 compute-0 sudo[147831]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:05 compute-0 sudo[147984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ectodbvwhvbabyultjapmtgnjixqmacg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211785.0116649-679-1513732521985/AnsiballZ_file.py'
Feb 27 17:03:05 compute-0 sudo[147984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:05 compute-0 python3.9[147987]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:05 compute-0 sudo[147984]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:05 compute-0 sudo[148137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-valdiazrzcwfquzhcnhophcjpjqkouzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211785.6840825-679-145142647138491/AnsiballZ_file.py'
Feb 27 17:03:05 compute-0 sudo[148137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:06 compute-0 python3.9[148140]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:06 compute-0 sudo[148137]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:06 compute-0 sudo[148290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djvhqghalwbnfodfyxzwpgejrvexsosl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211786.3206577-679-129057227749150/AnsiballZ_file.py'
Feb 27 17:03:06 compute-0 sudo[148290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:06 compute-0 python3.9[148293]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:06 compute-0 sudo[148290]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:07 compute-0 sudo[148443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpsnjnlbgmukyhsnontqocuouenkfblh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211786.9499013-679-68955719838323/AnsiballZ_file.py'
Feb 27 17:03:07 compute-0 sudo[148443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:07 compute-0 python3.9[148446]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:07 compute-0 sudo[148443]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:07 compute-0 sudo[148596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-quqkdijjfovcqbdclldsdmiyvvybbolc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211787.5805802-679-209404146422500/AnsiballZ_file.py'
Feb 27 17:03:07 compute-0 sudo[148596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:08 compute-0 python3.9[148599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:08 compute-0 sudo[148596]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:08 compute-0 sudo[148749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffjtqcttfmrdwfoypoywfokuisoixwkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211788.1706557-679-218311976587829/AnsiballZ_file.py'
Feb 27 17:03:08 compute-0 sudo[148749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:08 compute-0 python3.9[148752]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:08 compute-0 sudo[148749]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:09 compute-0 sudo[148902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auivurivitluhkcpadsqkzhiejrebqga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211788.7871468-679-55791530701746/AnsiballZ_file.py'
Feb 27 17:03:09 compute-0 sudo[148902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:09 compute-0 python3.9[148905]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:09 compute-0 sudo[148902]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:09 compute-0 sudo[149055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-memfevahwgltmjkjefslbbbhwrxhxwwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211789.3940535-679-248673078812784/AnsiballZ_file.py'
Feb 27 17:03:09 compute-0 sudo[149055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:09 compute-0 python3.9[149058]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:09 compute-0 sudo[149055]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:10 compute-0 sudo[149208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnwsbslbhxbyqibvvljhaboiidrwnnei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211790.0356815-679-245266525419006/AnsiballZ_file.py'
Feb 27 17:03:10 compute-0 sudo[149208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:10 compute-0 python3.9[149211]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:10 compute-0 sudo[149208]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:10 compute-0 sudo[149361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkydktecblwmwhtfqrdcebnnbkjqqxpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211790.6797578-679-273587989269450/AnsiballZ_file.py'
Feb 27 17:03:10 compute-0 sudo[149361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:11 compute-0 python3.9[149364]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:11 compute-0 sudo[149361]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:11 compute-0 sudo[149514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkammemsdkhguumzkkxfbccwxuhmchrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211791.2875092-679-181805044587447/AnsiballZ_file.py'
Feb 27 17:03:11 compute-0 sudo[149514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:11 compute-0 python3.9[149517]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:11 compute-0 sudo[149514]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:12 compute-0 sudo[149667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymfxetfvoympgimkcwqmlookqtjinqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211791.9547768-778-138043750038872/AnsiballZ_stat.py'
Feb 27 17:03:12 compute-0 sudo[149667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:12 compute-0 python3.9[149670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:12 compute-0 sudo[149667]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:12 compute-0 sudo[149791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqruvmukwjvdqwqkpkxrwhzlnfdmxthd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211791.9547768-778-138043750038872/AnsiballZ_copy.py'
Feb 27 17:03:12 compute-0 sudo[149791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:13 compute-0 python3.9[149794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211791.9547768-778-138043750038872/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:13 compute-0 sudo[149791]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:13 compute-0 sudo[149944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yctqdsgyhgofaoywipdgxjvzvfltjkcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211793.3270583-778-51778277238389/AnsiballZ_stat.py'
Feb 27 17:03:13 compute-0 sudo[149944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:13 compute-0 python3.9[149947]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:13 compute-0 sudo[149944]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:14 compute-0 sudo[150068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbxfgjwpfrpsvwkwmeclrxyklibcqmxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211793.3270583-778-51778277238389/AnsiballZ_copy.py'
Feb 27 17:03:14 compute-0 sudo[150068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:14 compute-0 python3.9[150071]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211793.3270583-778-51778277238389/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:14 compute-0 sudo[150068]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:14 compute-0 sudo[150221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvquuohooalhyzfgnsbfrwdfdaqjwniw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211794.4982462-778-36631532542426/AnsiballZ_stat.py'
Feb 27 17:03:14 compute-0 sudo[150221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:14 compute-0 python3.9[150224]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:14 compute-0 sudo[150221]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:15 compute-0 sudo[150345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huhebljfjspdstelanorkjdjnwpasgjx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211794.4982462-778-36631532542426/AnsiballZ_copy.py'
Feb 27 17:03:15 compute-0 sudo[150345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:15 compute-0 python3.9[150348]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211794.4982462-778-36631532542426/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:15 compute-0 sudo[150345]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:15 compute-0 sudo[150498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtzldxsfsyhicybgrnqltrtuhymidkes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211795.6425364-778-244742213033424/AnsiballZ_stat.py'
Feb 27 17:03:15 compute-0 sudo[150498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:16 compute-0 python3.9[150501]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:16 compute-0 sudo[150498]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:16 compute-0 sudo[150622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxtaohkqjqatvvcckmzooiawfgfcomne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211795.6425364-778-244742213033424/AnsiballZ_copy.py'
Feb 27 17:03:16 compute-0 sudo[150622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:16 compute-0 python3.9[150625]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211795.6425364-778-244742213033424/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:16 compute-0 sudo[150622]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:17 compute-0 sudo[150775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igqfdepidrejhqdrxfmjboafngunytfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211796.9040651-778-189491595190169/AnsiballZ_stat.py'
Feb 27 17:03:17 compute-0 sudo[150775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:17 compute-0 python3.9[150778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:17 compute-0 sudo[150775]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:17 compute-0 sudo[150899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alsgichvdpwrxohincputcaftowwvrik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211796.9040651-778-189491595190169/AnsiballZ_copy.py'
Feb 27 17:03:17 compute-0 sudo[150899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:17 compute-0 python3.9[150902]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211796.9040651-778-189491595190169/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:17 compute-0 sudo[150899]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:18 compute-0 sudo[151052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyozekdxjxdyadkuhbaywgjpfxswxink ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211798.0834565-778-204329286598984/AnsiballZ_stat.py'
Feb 27 17:03:18 compute-0 sudo[151052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:18 compute-0 python3.9[151055]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:18 compute-0 sudo[151052]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:18 compute-0 sudo[151176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjucdkmqqaoxrmouqecsvzngfwpbnnfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211798.0834565-778-204329286598984/AnsiballZ_copy.py'
Feb 27 17:03:18 compute-0 sudo[151176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:19 compute-0 python3.9[151179]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211798.0834565-778-204329286598984/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:19 compute-0 sudo[151176]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:19 compute-0 sudo[151329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcetldsxrffcdfqyzzppeujqckathfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211799.3663764-778-30142845756018/AnsiballZ_stat.py'
Feb 27 17:03:19 compute-0 sudo[151329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:20 compute-0 python3.9[151332]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:20 compute-0 sudo[151329]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:20 compute-0 podman[151333]: 2026-02-27 17:03:20.141366266 +0000 UTC m=+0.082438976 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 27 17:03:20 compute-0 sudo[151474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrrecuvqdorezwwpyhnfhwhnlirpehlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211799.3663764-778-30142845756018/AnsiballZ_copy.py'
Feb 27 17:03:20 compute-0 sudo[151474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:20 compute-0 python3.9[151477]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211799.3663764-778-30142845756018/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:20 compute-0 sudo[151474]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:21 compute-0 sudo[151627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sphwlwvuxiaxkzuuktkbhpycxudztpub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211800.788112-778-226678731250574/AnsiballZ_stat.py'
Feb 27 17:03:21 compute-0 sudo[151627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:21 compute-0 python3.9[151630]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:21 compute-0 sudo[151627]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:21 compute-0 sudo[151751]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzlaxgmspbewjrdhmxsoqvrprhmvobhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211800.788112-778-226678731250574/AnsiballZ_copy.py'
Feb 27 17:03:21 compute-0 sudo[151751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:21 compute-0 python3.9[151754]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211800.788112-778-226678731250574/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:21 compute-0 sudo[151751]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:22 compute-0 sudo[151917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvjouzjyovjzkvnglubzqvfvrpkowsxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211802.038039-778-25490521240385/AnsiballZ_stat.py'
Feb 27 17:03:22 compute-0 sudo[151917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:22 compute-0 podman[151878]: 2026-02-27 17:03:22.438324856 +0000 UTC m=+0.110845966 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 27 17:03:22 compute-0 python3.9[151924]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:22 compute-0 sudo[151917]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:22 compute-0 sudo[152054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlzlftbrfvfayvswkccvsaitpuchuguu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211802.038039-778-25490521240385/AnsiballZ_copy.py'
Feb 27 17:03:22 compute-0 sudo[152054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:23 compute-0 python3.9[152057]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211802.038039-778-25490521240385/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:23 compute-0 sudo[152054]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:23 compute-0 sudo[152207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbicjbtrqkbhedyiaotahehzlrbfrlpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211803.3237138-778-149389272860101/AnsiballZ_stat.py'
Feb 27 17:03:23 compute-0 sudo[152207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:23 compute-0 python3.9[152210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:23 compute-0 sudo[152207]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:24 compute-0 sudo[152331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwxnpqecocnrpahrdkzzlcakfqalebhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211803.3237138-778-149389272860101/AnsiballZ_copy.py'
Feb 27 17:03:24 compute-0 sudo[152331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:24 compute-0 python3.9[152334]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211803.3237138-778-149389272860101/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:24 compute-0 sudo[152331]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:24 compute-0 sudo[152484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bquzfsxbfvvaxsgxdlutxaqfbxqcdoxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211804.613738-778-6012300227476/AnsiballZ_stat.py'
Feb 27 17:03:24 compute-0 sudo[152484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:25 compute-0 python3.9[152487]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:25 compute-0 sudo[152484]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:25 compute-0 sudo[152608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkwfnmwjspklexqxdrianhecvzbppjez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211804.613738-778-6012300227476/AnsiballZ_copy.py'
Feb 27 17:03:25 compute-0 sudo[152608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:25 compute-0 python3.9[152611]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211804.613738-778-6012300227476/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:25 compute-0 sudo[152608]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:26 compute-0 sudo[152761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmmhtnekpzjrudndosuhwjyvuffgjdln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211805.8070374-778-108933003647693/AnsiballZ_stat.py'
Feb 27 17:03:26 compute-0 sudo[152761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:26 compute-0 python3.9[152764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:26 compute-0 sudo[152761]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:26 compute-0 sudo[152885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghwfbroicgxkrmbtexejljktzwaoljke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211805.8070374-778-108933003647693/AnsiballZ_copy.py'
Feb 27 17:03:26 compute-0 sudo[152885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:26 compute-0 python3.9[152888]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211805.8070374-778-108933003647693/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:26 compute-0 sudo[152885]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:27 compute-0 sudo[153038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nprpphkgswtaqqgzzrkkwhixlncpcnbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211807.004648-778-119475556888735/AnsiballZ_stat.py'
Feb 27 17:03:27 compute-0 sudo[153038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:27 compute-0 python3.9[153041]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:27 compute-0 sudo[153038]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:27 compute-0 sudo[153162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtujxrtphbcatzfdkebshdzdxmsokrea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211807.004648-778-119475556888735/AnsiballZ_copy.py'
Feb 27 17:03:28 compute-0 sudo[153162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:28 compute-0 python3.9[153165]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211807.004648-778-119475556888735/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:28 compute-0 sudo[153162]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:28 compute-0 sudo[153315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eijpwbicmzalanrsywcskfstvodbrsbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211808.3488932-778-186398337203543/AnsiballZ_stat.py'
Feb 27 17:03:28 compute-0 sudo[153315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:28 compute-0 python3.9[153318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:28 compute-0 sudo[153315]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:29 compute-0 sudo[153439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oynjkmluvacxhpvvahbktpxsqbnsmnqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211808.3488932-778-186398337203543/AnsiballZ_copy.py'
Feb 27 17:03:29 compute-0 sudo[153439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:29 compute-0 python3.9[153442]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211808.3488932-778-186398337203543/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:29 compute-0 sudo[153439]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:29 compute-0 python3.9[153592]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:03:30 compute-0 sudo[153745]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivvqhwyxyhagtxtjtccogjecgzgjhfwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211810.1161802-984-91655130665580/AnsiballZ_seboolean.py'
Feb 27 17:03:30 compute-0 sudo[153745]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:30 compute-0 python3.9[153748]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 27 17:03:32 compute-0 sudo[153745]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:32 compute-0 sudo[153902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzpmfumfxzqkpnxripxbwhdeqqgqfeaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211812.1968958-992-243430493084983/AnsiballZ_copy.py'
Feb 27 17:03:32 compute-0 dbus-broker-launch[787]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 27 17:03:32 compute-0 sudo[153902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:32 compute-0 python3.9[153905]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:32 compute-0 sudo[153902]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:33 compute-0 sudo[154055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsunmuepfaixvxsgwdwaicnebockelvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211812.8104424-992-189462734547334/AnsiballZ_copy.py'
Feb 27 17:03:33 compute-0 sudo[154055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:33 compute-0 python3.9[154058]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:33 compute-0 sudo[154055]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:33 compute-0 sudo[154208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwckxyfzfvmsehvkqnhthstvimjyafss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211813.3955264-992-122798131915714/AnsiballZ_copy.py'
Feb 27 17:03:33 compute-0 sudo[154208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:33 compute-0 python3.9[154211]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:33 compute-0 sudo[154208]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:34 compute-0 sudo[154361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eagorsmswnqbzxjsgnwdmerjaigrzrec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211814.0339494-992-216874809746704/AnsiballZ_copy.py'
Feb 27 17:03:34 compute-0 sudo[154361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:34 compute-0 python3.9[154364]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:34 compute-0 sudo[154361]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:34 compute-0 sudo[154514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlthgapyxobszeirvkqkpkzuimsuhuxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211814.6622944-992-24619393559719/AnsiballZ_copy.py'
Feb 27 17:03:34 compute-0 sudo[154514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:35 compute-0 python3.9[154517]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:35 compute-0 sudo[154514]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:35 compute-0 sudo[154667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvqiqsgqjpwdhihekstyrdvkhtqicesl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211815.243241-1028-10514831496079/AnsiballZ_copy.py'
Feb 27 17:03:35 compute-0 sudo[154667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:35 compute-0 python3.9[154670]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:35 compute-0 sudo[154667]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:36 compute-0 sudo[154820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubkxcunndacuuedckesafxpcpdzopcoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211815.8233032-1028-230983688118591/AnsiballZ_copy.py'
Feb 27 17:03:36 compute-0 sudo[154820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:36 compute-0 python3.9[154823]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:36 compute-0 sudo[154820]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:36 compute-0 sudo[154973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hndmshohdhsphbebewzekwfkytxxyfsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211816.5319326-1028-130663709234290/AnsiballZ_copy.py'
Feb 27 17:03:36 compute-0 sudo[154973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:36 compute-0 python3.9[154976]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:36 compute-0 sudo[154973]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:37 compute-0 sudo[155126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syayzgdldqtfbafxkfiejassqvursymy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211817.1406045-1028-24380359322870/AnsiballZ_copy.py'
Feb 27 17:03:37 compute-0 sudo[155126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:37 compute-0 python3.9[155129]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:37 compute-0 sudo[155126]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:38 compute-0 sudo[155279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklpchpwmtbztioexjbymamviudycshb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211817.804251-1028-196977267134831/AnsiballZ_copy.py'
Feb 27 17:03:38 compute-0 sudo[155279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:38 compute-0 python3.9[155282]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:38 compute-0 sudo[155279]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:38 compute-0 sudo[155432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jblqyxcblircvpiiktdqochzylxhbfhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211818.5270746-1064-49342168263946/AnsiballZ_systemd.py'
Feb 27 17:03:38 compute-0 sudo[155432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:39 compute-0 python3.9[155435]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:03:39 compute-0 systemd[1]: Reloading.
Feb 27 17:03:39 compute-0 systemd-rc-local-generator[155458]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:03:39 compute-0 systemd-sysv-generator[155465]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:03:39 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Feb 27 17:03:39 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Feb 27 17:03:39 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Feb 27 17:03:39 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 27 17:03:39 compute-0 systemd[1]: Starting libvirt logging daemon...
Feb 27 17:03:39 compute-0 systemd[1]: Started libvirt logging daemon.
Feb 27 17:03:39 compute-0 sudo[155432]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:39 compute-0 sudo[155632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejimhljkogpijwkieooomnubugiousph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211819.7019758-1064-73334268301583/AnsiballZ_systemd.py'
Feb 27 17:03:39 compute-0 sudo[155632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:40 compute-0 python3.9[155635]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:03:40 compute-0 systemd[1]: Reloading.
Feb 27 17:03:40 compute-0 systemd-rc-local-generator[155660]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:03:40 compute-0 systemd-sysv-generator[155664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:03:40 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Feb 27 17:03:40 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 27 17:03:40 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 27 17:03:40 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 27 17:03:40 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 27 17:03:40 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 27 17:03:40 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 27 17:03:40 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 27 17:03:40 compute-0 sudo[155632]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:41 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 27 17:03:41 compute-0 sudo[155858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssonogtksquzkrsewnwtmbqiswvuaxqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211820.7787845-1064-223814047802297/AnsiballZ_systemd.py'
Feb 27 17:03:41 compute-0 sudo[155858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:41 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 27 17:03:41 compute-0 python3.9[155861]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:03:41 compute-0 systemd[1]: Reloading.
Feb 27 17:03:41 compute-0 systemd-rc-local-generator[155889]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:03:41 compute-0 systemd-sysv-generator[155892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:03:41 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 27 17:03:41 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 27 17:03:41 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 27 17:03:41 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 27 17:03:41 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 27 17:03:41 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 27 17:03:41 compute-0 systemd[1]: Starting libvirt proxy daemon...
Feb 27 17:03:41 compute-0 systemd[1]: Started libvirt proxy daemon.
Feb 27 17:03:41 compute-0 sudo[155858]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:42 compute-0 sudo[156085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmluhjppbnqloxgqtcdwrvroneqqpxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211821.985445-1064-148373986709754/AnsiballZ_systemd.py'
Feb 27 17:03:42 compute-0 sudo[156085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:42 compute-0 python3.9[156088]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:03:42 compute-0 systemd[1]: Reloading.
Feb 27 17:03:42 compute-0 setroubleshoot[155811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 34d122bf-0f77-4a9d-b4ec-4ec3500f6b64
Feb 27 17:03:42 compute-0 setroubleshoot[155811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 27 17:03:42 compute-0 setroubleshoot[155811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 34d122bf-0f77-4a9d-b4ec-4ec3500f6b64
Feb 27 17:03:42 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:03:42 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:03:42 compute-0 setroubleshoot[155811]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Feb 27 17:03:42 compute-0 systemd-sysv-generator[156117]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:03:42 compute-0 systemd-rc-local-generator[156114]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:03:42 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Feb 27 17:03:42 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Feb 27 17:03:42 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 27 17:03:42 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 27 17:03:42 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 27 17:03:42 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 27 17:03:42 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 27 17:03:42 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 27 17:03:42 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 27 17:03:42 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 27 17:03:42 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 27 17:03:42 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 27 17:03:43 compute-0 sudo[156085]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:43 compute-0 sudo[156309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzeamjntvbyqeoxbbvsubjjkbinghuzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211823.1768415-1064-52108676081349/AnsiballZ_systemd.py'
Feb 27 17:03:43 compute-0 sudo[156309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:43 compute-0 python3.9[156312]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:03:43 compute-0 systemd[1]: Reloading.
Feb 27 17:03:43 compute-0 systemd-rc-local-generator[156341]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:03:43 compute-0 systemd-sysv-generator[156346]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:03:44 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Feb 27 17:03:44 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Feb 27 17:03:44 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Feb 27 17:03:44 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 27 17:03:44 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 27 17:03:44 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 27 17:03:44 compute-0 systemd[1]: Starting libvirt secret daemon...
Feb 27 17:03:44 compute-0 systemd[1]: Started libvirt secret daemon.
Feb 27 17:03:44 compute-0 sudo[156309]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:44 compute-0 sudo[156529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olsexqcjfbvyiaqduixkgvjlfchkwkgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211824.5004973-1101-170975107433284/AnsiballZ_file.py'
Feb 27 17:03:44 compute-0 sudo[156529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:44 compute-0 python3.9[156532]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:44 compute-0 sudo[156529]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:45 compute-0 sudo[156682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzcrpjsywtpvtodvtpuqhmabyoezlvhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211825.1002266-1109-189576089479277/AnsiballZ_find.py'
Feb 27 17:03:45 compute-0 sudo[156682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:45 compute-0 python3.9[156685]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 17:03:45 compute-0 sudo[156682]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:46 compute-0 sudo[156835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-horvblsnymdxkwyrkbyyoqtytdsucokq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211826.0436187-1123-111919597950183/AnsiballZ_stat.py'
Feb 27 17:03:46 compute-0 sudo[156835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:46 compute-0 python3.9[156838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:46 compute-0 sudo[156835]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:46 compute-0 sudo[156959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xywhbqpsjvvfnsbvslzbejuvlnislfkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211826.0436187-1123-111919597950183/AnsiballZ_copy.py'
Feb 27 17:03:46 compute-0 sudo[156959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:46 compute-0 python3.9[156962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211826.0436187-1123-111919597950183/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:46 compute-0 sudo[156959]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:03:47.074 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:03:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:03:47.075 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:03:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:03:47.075 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:03:47 compute-0 sudo[157112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvdeuooqwjsdciemejuzxfyexrnzsuhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211827.310995-1139-6638767321158/AnsiballZ_file.py'
Feb 27 17:03:47 compute-0 sudo[157112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:47 compute-0 python3.9[157115]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:47 compute-0 sudo[157112]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:48 compute-0 sudo[157265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzvyplehwznjoclrehlzrhtspeqejftd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211828.0205076-1147-96411417518658/AnsiballZ_stat.py'
Feb 27 17:03:48 compute-0 sudo[157265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:48 compute-0 python3.9[157268]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:48 compute-0 sudo[157265]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:48 compute-0 sudo[157344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eklaflxdowaetflntmealgomssgjkhkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211828.0205076-1147-96411417518658/AnsiballZ_file.py'
Feb 27 17:03:49 compute-0 sudo[157344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:49 compute-0 python3.9[157347]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:49 compute-0 sudo[157344]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:49 compute-0 sudo[157497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igzzctnonvhotngsfeavwvwoawmesulx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211829.487648-1159-16288801965626/AnsiballZ_stat.py'
Feb 27 17:03:49 compute-0 sudo[157497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:49 compute-0 python3.9[157500]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:49 compute-0 sudo[157497]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:50 compute-0 sudo[157576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbblssahoexyakgbdaneetrhofkcfxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211829.487648-1159-16288801965626/AnsiballZ_file.py'
Feb 27 17:03:50 compute-0 sudo[157576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:50 compute-0 python3.9[157579]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.yovfmhl4 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:50 compute-0 sudo[157576]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:50 compute-0 podman[157580]: 2026-02-27 17:03:50.40765899 +0000 UTC m=+0.053955704 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:03:50 compute-0 sudo[157749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxpjroedrmtkdaisvfkbpuggkwknjalg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211830.5712714-1171-18349725542477/AnsiballZ_stat.py'
Feb 27 17:03:50 compute-0 sudo[157749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:51 compute-0 python3.9[157752]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:51 compute-0 sudo[157749]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:51 compute-0 sudo[157828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-biovhbsairdhghcznbvkgmdkimyddaac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211830.5712714-1171-18349725542477/AnsiballZ_file.py'
Feb 27 17:03:51 compute-0 sudo[157828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:51 compute-0 python3.9[157831]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:51 compute-0 sudo[157828]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:52 compute-0 sudo[157981]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyifzhrfljhzfmbpjdzvtynrirznjsgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211832.0040407-1184-219536035913081/AnsiballZ_command.py'
Feb 27 17:03:52 compute-0 sudo[157981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:52 compute-0 python3.9[157984]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:03:52 compute-0 sudo[157981]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:52 compute-0 podman[157986]: 2026-02-27 17:03:52.672873053 +0000 UTC m=+0.094076617 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 27 17:03:52 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 27 17:03:52 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 27 17:03:53 compute-0 sudo[158158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvzcjslpuhnudwklfskhpapivsndbyjy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211832.7461379-1192-1287105830811/AnsiballZ_edpm_nftables_from_files.py'
Feb 27 17:03:53 compute-0 sudo[158158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:53 compute-0 python3[158161]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 27 17:03:53 compute-0 sudo[158158]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:54 compute-0 sudo[158311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqwpehlqxhgtwprixswwatwwjlmixavt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211833.756428-1200-143308034727861/AnsiballZ_stat.py'
Feb 27 17:03:54 compute-0 sudo[158311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:54 compute-0 python3.9[158314]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:54 compute-0 sudo[158311]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:54 compute-0 sudo[158390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxjgpujdifmhsuffgekkojnrqloxdfeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211833.756428-1200-143308034727861/AnsiballZ_file.py'
Feb 27 17:03:54 compute-0 sudo[158390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:54 compute-0 python3.9[158393]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:54 compute-0 sudo[158390]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:55 compute-0 sudo[158543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhtpbtrklonuohgkxtxfgyckryuaqtig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211835.0502427-1212-98214627273945/AnsiballZ_stat.py'
Feb 27 17:03:55 compute-0 sudo[158543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:55 compute-0 python3.9[158546]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:55 compute-0 sudo[158543]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:55 compute-0 sudo[158669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdoandiljqeguuckhrirvldrmqdndyzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211835.0502427-1212-98214627273945/AnsiballZ_copy.py'
Feb 27 17:03:56 compute-0 sudo[158669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:56 compute-0 python3.9[158672]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211835.0502427-1212-98214627273945/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:56 compute-0 sudo[158669]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:56 compute-0 sudo[158822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbzntohuffjteqlzqfwuswmmitvoudva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211836.3438928-1227-57652008910381/AnsiballZ_stat.py'
Feb 27 17:03:56 compute-0 sudo[158822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:56 compute-0 python3.9[158825]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:56 compute-0 sudo[158822]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:57 compute-0 sudo[158901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujmtaxckevwzdlzpbvqyaczwwomvdbvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211836.3438928-1227-57652008910381/AnsiballZ_file.py'
Feb 27 17:03:57 compute-0 sudo[158901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:57 compute-0 python3.9[158904]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:57 compute-0 sudo[158901]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:57 compute-0 sudo[159054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpyjxchipxgiijsnadvykoexwyukdzwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211837.5937762-1239-136605907735874/AnsiballZ_stat.py'
Feb 27 17:03:57 compute-0 sudo[159054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:58 compute-0 python3.9[159057]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:58 compute-0 sudo[159054]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:58 compute-0 sudo[159133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exrzdcbqonxecnehuckxcsjeuckeoflp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211837.5937762-1239-136605907735874/AnsiballZ_file.py'
Feb 27 17:03:58 compute-0 sudo[159133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:58 compute-0 python3.9[159136]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:58 compute-0 sudo[159133]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:59 compute-0 sudo[159286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uutqtahdgqgtycqmnagljiuqnjwuqgep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211838.777657-1251-162603017859628/AnsiballZ_stat.py'
Feb 27 17:03:59 compute-0 sudo[159286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:59 compute-0 python3.9[159289]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:03:59 compute-0 sudo[159286]: pam_unix(sudo:session): session closed for user root
Feb 27 17:03:59 compute-0 sudo[159412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evplmupeskgavrtdcnlshpbqtjcdwlcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211838.777657-1251-162603017859628/AnsiballZ_copy.py'
Feb 27 17:03:59 compute-0 sudo[159412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:03:59 compute-0 python3.9[159415]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772211838.777657-1251-162603017859628/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:03:59 compute-0 sudo[159412]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:00 compute-0 sudo[159565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psyirabtvfbcoutoifqhgjjmcmmambtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211840.141964-1266-261801605302427/AnsiballZ_file.py'
Feb 27 17:04:00 compute-0 sudo[159565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:00 compute-0 python3.9[159568]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:00 compute-0 sudo[159565]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:01 compute-0 sudo[159718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rveukpvthqcsqcqtwznrzrfbzppdetcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211840.9204502-1274-259822599468628/AnsiballZ_command.py'
Feb 27 17:04:01 compute-0 sudo[159718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:01 compute-0 python3.9[159721]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:01 compute-0 sudo[159718]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:02 compute-0 sudo[159874]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljxfgylxkpyliomvsijdkaelrprmrgxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211841.7575562-1282-187970336460172/AnsiballZ_blockinfile.py'
Feb 27 17:04:02 compute-0 sudo[159874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:02 compute-0 python3.9[159877]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:02 compute-0 sudo[159874]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:02 compute-0 sudo[160027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxqtvhuqeoheoppbdssfcfhcyvnwshds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211842.6833766-1291-10413730583701/AnsiballZ_command.py'
Feb 27 17:04:02 compute-0 sudo[160027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:03 compute-0 python3.9[160030]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:03 compute-0 sudo[160027]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:03 compute-0 sudo[160181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjsqwzvtvfiujlgcwnzgueuauqjuwtcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211843.3830395-1299-127864583817635/AnsiballZ_stat.py'
Feb 27 17:04:03 compute-0 sudo[160181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:03 compute-0 python3.9[160184]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:04:03 compute-0 sudo[160181]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:04 compute-0 sudo[160336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcjyctrqdonloxzqkagbjhfcxzgfzirm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211844.164242-1307-43625094681202/AnsiballZ_command.py'
Feb 27 17:04:04 compute-0 sudo[160336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:04 compute-0 python3.9[160339]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:04 compute-0 sudo[160336]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:05 compute-0 sudo[160492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vctwudeyusqyrqqyiiddzueimfhiwohz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211844.869972-1315-114257726296119/AnsiballZ_file.py'
Feb 27 17:04:05 compute-0 sudo[160492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:05 compute-0 python3.9[160495]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:05 compute-0 sudo[160492]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:05 compute-0 sudo[160645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzfvlmydcqmqaccnobthzhkbnlvsrskg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211845.5546174-1323-5776940464989/AnsiballZ_stat.py'
Feb 27 17:04:05 compute-0 sudo[160645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:06 compute-0 python3.9[160648]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:04:06 compute-0 sudo[160645]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:06 compute-0 sudo[160769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duevmvefthdubemhsyclyualiakdrgwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211845.5546174-1323-5776940464989/AnsiballZ_copy.py'
Feb 27 17:04:06 compute-0 sudo[160769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:06 compute-0 python3.9[160772]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211845.5546174-1323-5776940464989/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:06 compute-0 sudo[160769]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:07 compute-0 sudo[160922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tduncgnykfnwfuabsqjithosirdacgve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211846.9706972-1338-94596193794246/AnsiballZ_stat.py'
Feb 27 17:04:07 compute-0 sudo[160922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:07 compute-0 python3.9[160925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:04:07 compute-0 sudo[160922]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:07 compute-0 sudo[161046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfrobikvhbfmsskzulhqspeixzwpvacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211846.9706972-1338-94596193794246/AnsiballZ_copy.py'
Feb 27 17:04:07 compute-0 sudo[161046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:08 compute-0 python3.9[161049]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211846.9706972-1338-94596193794246/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:08 compute-0 sudo[161046]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:08 compute-0 sudo[161199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukhpbraefewtrzesqegsleukdlpsmlll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211848.288459-1353-18473338909902/AnsiballZ_stat.py'
Feb 27 17:04:08 compute-0 sudo[161199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:08 compute-0 python3.9[161202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:04:08 compute-0 sudo[161199]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:09 compute-0 sudo[161323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjnpnktivdvkkbpdgzwnkwntfmovxdbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211848.288459-1353-18473338909902/AnsiballZ_copy.py'
Feb 27 17:04:09 compute-0 sudo[161323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:09 compute-0 python3.9[161326]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211848.288459-1353-18473338909902/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:09 compute-0 sudo[161323]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:09 compute-0 sudo[161476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phwzleiahvuzuijbpilgnffwzrvvdfge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211849.6531112-1368-208782590360101/AnsiballZ_systemd.py'
Feb 27 17:04:09 compute-0 sudo[161476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:10 compute-0 python3.9[161479]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:04:10 compute-0 systemd[1]: Reloading.
Feb 27 17:04:10 compute-0 systemd-rc-local-generator[161502]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:04:10 compute-0 systemd-sysv-generator[161505]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:04:10 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Feb 27 17:04:10 compute-0 sudo[161476]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:11 compute-0 sudo[161675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tscxbevntgivkiyysvlatvotroeklmex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211850.9278677-1376-1752139527308/AnsiballZ_systemd.py'
Feb 27 17:04:11 compute-0 sudo[161675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:11 compute-0 python3.9[161678]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 27 17:04:11 compute-0 systemd[1]: Reloading.
Feb 27 17:04:11 compute-0 systemd-rc-local-generator[161704]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:04:11 compute-0 systemd-sysv-generator[161709]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:04:12 compute-0 systemd[1]: Reloading.
Feb 27 17:04:12 compute-0 systemd-rc-local-generator[161753]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:04:12 compute-0 systemd-sysv-generator[161757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:04:12 compute-0 sudo[161675]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:12 compute-0 sshd-session[106670]: Connection closed by 192.168.122.30 port 58468
Feb 27 17:04:12 compute-0 sshd-session[106667]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:04:12 compute-0 systemd-logind[803]: Session 22 logged out. Waiting for processes to exit.
Feb 27 17:04:12 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Feb 27 17:04:12 compute-0 systemd[1]: session-22.scope: Consumed 2min 59.594s CPU time.
Feb 27 17:04:12 compute-0 systemd-logind[803]: Removed session 22.
Feb 27 17:04:17 compute-0 sshd-session[161790]: Accepted publickey for zuul from 192.168.122.30 port 56574 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:04:17 compute-0 systemd-logind[803]: New session 23 of user zuul.
Feb 27 17:04:17 compute-0 systemd[1]: Started Session 23 of User zuul.
Feb 27 17:04:17 compute-0 sshd-session[161790]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:04:19 compute-0 python3.9[161943]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 17:04:20 compute-0 python3.9[162097]: ansible-ansible.builtin.service_facts Invoked
Feb 27 17:04:20 compute-0 network[162120]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 17:04:20 compute-0 network[162125]: 'network-scripts' will be removed from distribution in near future.
Feb 27 17:04:20 compute-0 network[162127]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 17:04:20 compute-0 podman[162102]: 2026-02-27 17:04:20.660060824 +0000 UTC m=+0.059497504 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 27 17:04:22 compute-0 podman[162210]: 2026-02-27 17:04:22.870868814 +0000 UTC m=+0.135022841 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 27 17:04:24 compute-0 sudo[162432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boxdgwdsxrgvkylrlqthxzsminxvjnlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211864.1096675-42-166340285075239/AnsiballZ_setup.py'
Feb 27 17:04:24 compute-0 sudo[162432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:24 compute-0 python3.9[162435]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 27 17:04:25 compute-0 sudo[162432]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:25 compute-0 sudo[162517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqufonresqsyjjveevdqnyntlzdgkjls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211864.1096675-42-166340285075239/AnsiballZ_dnf.py'
Feb 27 17:04:25 compute-0 sudo[162517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:25 compute-0 python3.9[162520]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 17:04:31 compute-0 sudo[162517]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:31 compute-0 sudo[162671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unkbnrlacwaqsdbbltcmdojsgooemwnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211871.3714457-54-149803858090435/AnsiballZ_stat.py'
Feb 27 17:04:31 compute-0 sudo[162671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:32 compute-0 python3.9[162674]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:04:32 compute-0 sudo[162671]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:32 compute-0 sudo[162824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dasynluysheaygsgvoiovuxonjrcfyes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211872.36712-64-7467945893979/AnsiballZ_command.py'
Feb 27 17:04:32 compute-0 sudo[162824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:33 compute-0 python3.9[162827]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:33 compute-0 sudo[162824]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:33 compute-0 sudo[162978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvbiigigsfpzegarggnpjejcjxlvtjcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211873.3755856-74-43078924523934/AnsiballZ_stat.py'
Feb 27 17:04:33 compute-0 sudo[162978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:33 compute-0 python3.9[162981]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:04:33 compute-0 sudo[162978]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:34 compute-0 sudo[163131]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ppkupsgsqvwacfgeoaxxynonvncxvoau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211874.1846774-82-184153257580302/AnsiballZ_command.py'
Feb 27 17:04:34 compute-0 sudo[163131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:34 compute-0 python3.9[163134]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:34 compute-0 sudo[163131]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:35 compute-0 sudo[163285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-socfsnymmbqdimxocfzkqnrehhleaood ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211875.0941343-90-90916182734940/AnsiballZ_stat.py'
Feb 27 17:04:35 compute-0 sudo[163285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:35 compute-0 python3.9[163288]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:04:35 compute-0 sudo[163285]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:36 compute-0 sudo[163409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vggftulezgovncwmdmdddkopgbkivddx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211875.0941343-90-90916182734940/AnsiballZ_copy.py'
Feb 27 17:04:36 compute-0 sudo[163409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:36 compute-0 python3.9[163412]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211875.0941343-90-90916182734940/.source.iscsi _original_basename=.7n8b351g follow=False checksum=76ac2cc7345bf5fc27638c38dc9d693e4c463bbe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:36 compute-0 sudo[163409]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:37 compute-0 sudo[163562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmygxizqrlpjjvmmmmldarlpqexnniqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211876.77048-105-148129496915065/AnsiballZ_file.py'
Feb 27 17:04:37 compute-0 sudo[163562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:37 compute-0 python3.9[163565]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:37 compute-0 sudo[163562]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:38 compute-0 sudo[163715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzzyengptnddjgdhudjjlhymdxjqqvvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211877.7103927-113-84663778673293/AnsiballZ_lineinfile.py'
Feb 27 17:04:38 compute-0 sudo[163715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:38 compute-0 python3.9[163718]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:38 compute-0 sudo[163715]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:39 compute-0 sudo[163868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmmoiyqamaxlrxbfalzrwbqbmzkcaibq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211878.5867143-122-93539561560107/AnsiballZ_systemd_service.py'
Feb 27 17:04:39 compute-0 sudo[163868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:39 compute-0 python3.9[163871]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:04:39 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 27 17:04:39 compute-0 sudo[163868]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:40 compute-0 sudo[164025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfggkskccgyvppfunrgufhzfckgawmnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211879.8088841-130-119494039227189/AnsiballZ_systemd_service.py'
Feb 27 17:04:40 compute-0 sudo[164025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:40 compute-0 python3.9[164028]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:04:40 compute-0 systemd[1]: Reloading.
Feb 27 17:04:40 compute-0 systemd-rc-local-generator[164055]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:04:40 compute-0 systemd-sysv-generator[164058]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:04:40 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 27 17:04:40 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 27 17:04:40 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Feb 27 17:04:40 compute-0 systemd[1]: Started Open-iSCSI.
Feb 27 17:04:40 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 27 17:04:40 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 27 17:04:40 compute-0 sudo[164025]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:41 compute-0 python3.9[164234]: ansible-ansible.builtin.service_facts Invoked
Feb 27 17:04:42 compute-0 network[164251]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 17:04:42 compute-0 network[164252]: 'network-scripts' will be removed from distribution in near future.
Feb 27 17:04:42 compute-0 network[164253]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 17:04:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:04:47.076 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:04:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:04:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:04:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:04:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:04:47 compute-0 sudo[164523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drlrfsnhomknewmuecewmhpgsnmeoqpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211887.4430945-153-101952740400974/AnsiballZ_dnf.py'
Feb 27 17:04:47 compute-0 sudo[164523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:48 compute-0 python3.9[164526]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 17:04:51 compute-0 podman[164534]: 2026-02-27 17:04:51.01262574 +0000 UTC m=+0.111359989 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:04:51 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 17:04:51 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 17:04:51 compute-0 systemd[1]: Reloading.
Feb 27 17:04:51 compute-0 systemd-rc-local-generator[164587]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:04:51 compute-0 systemd-sysv-generator[164594]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:04:51 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 17:04:52 compute-0 sudo[164523]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:53 compute-0 sudo[164888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jafmisfbzailtmwvjjzhobgkppoeeaiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211892.9482238-162-68489978796256/AnsiballZ_file.py'
Feb 27 17:04:53 compute-0 sudo[164888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:53 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 17:04:53 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 17:04:53 compute-0 systemd[1]: run-r6420c616a69e48d4870ff22be543f614.service: Deactivated successfully.
Feb 27 17:04:53 compute-0 podman[164851]: 2026-02-27 17:04:53.518484625 +0000 UTC m=+0.156965450 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:04:53 compute-0 python3.9[164897]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 27 17:04:53 compute-0 sudo[164888]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:54 compute-0 sudo[165058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elrxuonfssfkbufmsarjmnjbxzeghsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211893.9154892-170-146922424372944/AnsiballZ_modprobe.py'
Feb 27 17:04:54 compute-0 sudo[165058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:54 compute-0 python3.9[165061]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 27 17:04:54 compute-0 sudo[165058]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:55 compute-0 sudo[165215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkwwmbpbxhcwajlgrlhgbdebtgdwtjyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211894.7977474-178-90757431575959/AnsiballZ_stat.py'
Feb 27 17:04:55 compute-0 sudo[165215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:55 compute-0 python3.9[165218]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:04:55 compute-0 sudo[165215]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:55 compute-0 sudo[165339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlfnfeotyyzuhjvmtvmxztuvzrdyquoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211894.7977474-178-90757431575959/AnsiballZ_copy.py'
Feb 27 17:04:55 compute-0 sudo[165339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:55 compute-0 python3.9[165342]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211894.7977474-178-90757431575959/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:55 compute-0 sudo[165339]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:56 compute-0 sudo[165492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bniweocvvpojlrlsudkdsdkwxzqjkkgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211896.1728108-194-25047788162101/AnsiballZ_lineinfile.py'
Feb 27 17:04:56 compute-0 sudo[165492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:56 compute-0 python3.9[165495]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:04:56 compute-0 sudo[165492]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:57 compute-0 sudo[165645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zchczbanwqrrmmkrroxrbeffjegknpgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211896.863442-202-148951633833373/AnsiballZ_systemd.py'
Feb 27 17:04:57 compute-0 sudo[165645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:57 compute-0 python3.9[165648]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:04:58 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 27 17:04:58 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 27 17:04:58 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 27 17:04:58 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 27 17:04:58 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 27 17:04:58 compute-0 sudo[165645]: pam_unix(sudo:session): session closed for user root
Feb 27 17:04:59 compute-0 sudo[165802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgvlystmaxsosaaxuhktgdnzudpumzoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211899.063196-210-232033963615651/AnsiballZ_command.py'
Feb 27 17:04:59 compute-0 sudo[165802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:04:59 compute-0 python3.9[165805]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:04:59 compute-0 sudo[165802]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:00 compute-0 sudo[165956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frovzqsmskxhocizrfuwwuahwinuynpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211900.0012336-220-228251589620802/AnsiballZ_stat.py'
Feb 27 17:05:00 compute-0 sudo[165956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:00 compute-0 python3.9[165959]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:05:00 compute-0 sudo[165956]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:01 compute-0 sudo[166109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-limbmmxduiolvtujakphhmviidngtacp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211900.7871094-229-77510161401724/AnsiballZ_stat.py'
Feb 27 17:05:01 compute-0 sudo[166109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:01 compute-0 python3.9[166112]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:05:01 compute-0 sudo[166109]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:01 compute-0 sudo[166233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwkdzcoofnwqcipomzwyzhdmxphdsnfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211900.7871094-229-77510161401724/AnsiballZ_copy.py'
Feb 27 17:05:01 compute-0 sudo[166233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:01 compute-0 python3.9[166236]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211900.7871094-229-77510161401724/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:01 compute-0 sudo[166233]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:02 compute-0 sudo[166386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twcgwctpgfizqaxanygjefhltengigao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211902.1420124-244-268663263591754/AnsiballZ_command.py'
Feb 27 17:05:02 compute-0 sudo[166386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:02 compute-0 python3.9[166389]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:02 compute-0 sudo[166386]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:03 compute-0 sudo[166540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbuehbwbtbuziclbeaghkcultwbzixiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211902.9135375-252-9775343573020/AnsiballZ_lineinfile.py'
Feb 27 17:05:03 compute-0 sudo[166540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:03 compute-0 python3.9[166543]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:03 compute-0 sudo[166540]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:04 compute-0 sudo[166693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywrlpcrtgjfedhkdcqkmyzdnepdonriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211903.6874964-260-220681217615175/AnsiballZ_replace.py'
Feb 27 17:05:04 compute-0 sudo[166693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:04 compute-0 python3.9[166696]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:04 compute-0 sudo[166693]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:04 compute-0 sudo[166846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgbxtkwxxwnoeqtzpmrwmukrpbwtwxlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211904.6343567-268-212609923564356/AnsiballZ_replace.py'
Feb 27 17:05:04 compute-0 sudo[166846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:05 compute-0 python3.9[166849]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:05 compute-0 sudo[166846]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:05 compute-0 sudo[166999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgmxigajcxbhrhwdyhwgbvfcckkybhvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211905.466507-277-2641038783368/AnsiballZ_lineinfile.py'
Feb 27 17:05:05 compute-0 sudo[166999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:05 compute-0 python3.9[167002]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:06 compute-0 sudo[166999]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:06 compute-0 sudo[167152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwspwzhhpzaquxdvpllgqqbhvdzaanoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211906.1469333-277-270653665503919/AnsiballZ_lineinfile.py'
Feb 27 17:05:06 compute-0 sudo[167152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:06 compute-0 python3.9[167155]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:06 compute-0 sudo[167152]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:07 compute-0 sudo[167305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lexvaqytpxefzsjrcsvtluddyqugscfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211906.7954826-277-109229768551443/AnsiballZ_lineinfile.py'
Feb 27 17:05:07 compute-0 sudo[167305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:07 compute-0 python3.9[167308]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:07 compute-0 sudo[167305]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:07 compute-0 sudo[167458]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjpfjxuxffvhfddjoqtkjatktuelrxef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211907.4735968-277-215065047077204/AnsiballZ_lineinfile.py'
Feb 27 17:05:07 compute-0 sudo[167458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:07 compute-0 python3.9[167461]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:07 compute-0 sudo[167458]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:08 compute-0 sudo[167611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpvzosvseadazlinblxmnwntjaplrcxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211908.2041886-306-100299437319617/AnsiballZ_stat.py'
Feb 27 17:05:08 compute-0 sudo[167611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:08 compute-0 python3.9[167614]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:05:08 compute-0 sudo[167611]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:09 compute-0 sudo[167766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmabqpymwoxeekvxalqebhtsvjelbcjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211908.9423454-314-153379354471584/AnsiballZ_command.py'
Feb 27 17:05:09 compute-0 sudo[167766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:09 compute-0 python3.9[167769]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:09 compute-0 sudo[167766]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:10 compute-0 sudo[167920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkuyhszhbcyxxwksrpfogujdxfkdjljj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211909.7549314-323-122960584622531/AnsiballZ_systemd_service.py'
Feb 27 17:05:10 compute-0 sudo[167920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:10 compute-0 python3.9[167923]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:10 compute-0 systemd[1]: Listening on multipathd control socket.
Feb 27 17:05:10 compute-0 sudo[167920]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:10 compute-0 sudo[168077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drodbzdettrfzxasswmeaphtlrylswtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211910.690822-331-235780944191669/AnsiballZ_systemd_service.py'
Feb 27 17:05:11 compute-0 sudo[168077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:11 compute-0 python3.9[168080]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:11 compute-0 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 27 17:05:11 compute-0 udevadm[168085]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 27 17:05:11 compute-0 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 27 17:05:11 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 27 17:05:11 compute-0 multipathd[168089]: --------start up--------
Feb 27 17:05:11 compute-0 multipathd[168089]: read /etc/multipath.conf
Feb 27 17:05:11 compute-0 multipathd[168089]: path checkers start up
Feb 27 17:05:11 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 27 17:05:11 compute-0 sudo[168077]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:12 compute-0 sudo[168247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfauzltbiosdljyywktnlixcbipbhxjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211912.0749927-343-34004341941842/AnsiballZ_file.py'
Feb 27 17:05:12 compute-0 sudo[168247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:12 compute-0 python3.9[168250]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 27 17:05:12 compute-0 sudo[168247]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:13 compute-0 sudo[168400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkqvhrikjshkykavukxarvbjfojdpzhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211912.7840376-351-234280003842197/AnsiballZ_modprobe.py'
Feb 27 17:05:13 compute-0 sudo[168400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:13 compute-0 python3.9[168403]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 27 17:05:13 compute-0 kernel: Key type psk registered
Feb 27 17:05:13 compute-0 sudo[168400]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:13 compute-0 sudo[168562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsbmdjduckchayqhcsiwtqwbayhbsqpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211913.581112-359-146329202738409/AnsiballZ_stat.py'
Feb 27 17:05:13 compute-0 sudo[168562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:14 compute-0 python3.9[168565]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:05:14 compute-0 sudo[168562]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:14 compute-0 sudo[168686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kegatvgoexpieytiydoocfpnwayymncn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211913.581112-359-146329202738409/AnsiballZ_copy.py'
Feb 27 17:05:14 compute-0 sudo[168686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:14 compute-0 python3.9[168689]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211913.581112-359-146329202738409/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:14 compute-0 sudo[168686]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:15 compute-0 sudo[168839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzrkxsqnhgmgsnhcsnneajsrvdtuvqti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211915.1239634-375-50351349830685/AnsiballZ_lineinfile.py'
Feb 27 17:05:15 compute-0 sudo[168839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:15 compute-0 python3.9[168842]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:15 compute-0 sudo[168839]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:16 compute-0 sudo[168992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziwojchzpjryoxepychwqlttormghohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211915.882345-383-255668049202433/AnsiballZ_systemd.py'
Feb 27 17:05:16 compute-0 sudo[168992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:16 compute-0 python3.9[168995]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:05:16 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 27 17:05:16 compute-0 systemd[1]: Stopped Load Kernel Modules.
Feb 27 17:05:16 compute-0 systemd[1]: Stopping Load Kernel Modules...
Feb 27 17:05:16 compute-0 systemd[1]: Starting Load Kernel Modules...
Feb 27 17:05:16 compute-0 systemd[1]: Finished Load Kernel Modules.
Feb 27 17:05:16 compute-0 sudo[168992]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:17 compute-0 sudo[169150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvrecwizqujrirdtnnlyzmmwaimocvno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211917.0597084-391-151039113018561/AnsiballZ_dnf.py'
Feb 27 17:05:17 compute-0 sudo[169150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:17 compute-0 python3.9[169153]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 27 17:05:20 compute-0 systemd[1]: Reloading.
Feb 27 17:05:20 compute-0 systemd-rc-local-generator[169183]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:05:20 compute-0 systemd-sysv-generator[169190]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:05:20 compute-0 systemd[1]: Reloading.
Feb 27 17:05:20 compute-0 systemd-sysv-generator[169231]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:05:20 compute-0 systemd-rc-local-generator[169228]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:05:20 compute-0 virtproxyd[155915]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 27 17:05:20 compute-0 virtproxyd[155915]: hostname: compute-0
Feb 27 17:05:20 compute-0 virtproxyd[155915]: nl_recv returned with error: No buffer space available
Feb 27 17:05:20 compute-0 systemd-logind[803]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 27 17:05:21 compute-0 systemd-logind[803]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 27 17:05:21 compute-0 podman[169280]: 2026-02-27 17:05:21.113054132 +0000 UTC m=+0.053227478 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 27 17:05:21 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 27 17:05:21 compute-0 systemd[1]: Starting man-db-cache-update.service...
Feb 27 17:05:21 compute-0 systemd[1]: Reloading.
Feb 27 17:05:21 compute-0 systemd-rc-local-generator[169352]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:05:21 compute-0 systemd-sysv-generator[169359]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:05:21 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Feb 27 17:05:22 compute-0 sudo[169150]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:22 compute-0 sudo[170665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyzxyhxboijvunxxtfeqzaxpgiaqgznq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211922.573389-399-160734752341524/AnsiballZ_systemd_service.py'
Feb 27 17:05:22 compute-0 sudo[170665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:23 compute-0 python3.9[170668]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:05:23 compute-0 iscsid[164074]: iscsid shutting down.
Feb 27 17:05:23 compute-0 systemd[1]: Stopping Open-iSCSI...
Feb 27 17:05:23 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Feb 27 17:05:23 compute-0 systemd[1]: Stopped Open-iSCSI.
Feb 27 17:05:23 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 27 17:05:23 compute-0 systemd[1]: Starting Open-iSCSI...
Feb 27 17:05:23 compute-0 systemd[1]: Started Open-iSCSI.
Feb 27 17:05:23 compute-0 sudo[170665]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:23 compute-0 podman[170726]: 2026-02-27 17:05:23.724982491 +0000 UTC m=+0.113232456 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:05:23 compute-0 sudo[170849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijnbuvneijzmafhmlfsjkvbdbqnunyhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211923.5524533-407-76089103563092/AnsiballZ_systemd_service.py'
Feb 27 17:05:23 compute-0 sudo[170849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:24 compute-0 python3.9[170852]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:05:24 compute-0 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 27 17:05:24 compute-0 multipathd[168089]: exit (signal)
Feb 27 17:05:24 compute-0 multipathd[168089]: --------shut down-------
Feb 27 17:05:24 compute-0 systemd[1]: multipathd.service: Deactivated successfully.
Feb 27 17:05:24 compute-0 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 27 17:05:24 compute-0 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 27 17:05:24 compute-0 multipathd[170858]: --------start up--------
Feb 27 17:05:24 compute-0 multipathd[170858]: read /etc/multipath.conf
Feb 27 17:05:24 compute-0 multipathd[170858]: path checkers start up
Feb 27 17:05:24 compute-0 systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 27 17:05:24 compute-0 sudo[170849]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:25 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 27 17:05:25 compute-0 systemd[1]: Finished man-db-cache-update.service.
Feb 27 17:05:25 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.308s CPU time.
Feb 27 17:05:25 compute-0 systemd[1]: run-r72d81938a35c4961847767b8895fb7b8.service: Deactivated successfully.
Feb 27 17:05:25 compute-0 python3.9[171016]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 17:05:26 compute-0 sudo[171171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evfqhqmnwhkfnarfsypdinldkfffbbmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211925.886307-425-116015366229117/AnsiballZ_file.py'
Feb 27 17:05:26 compute-0 sudo[171171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:26 compute-0 python3.9[171174]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:26 compute-0 sudo[171171]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:27 compute-0 sudo[171324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtrebwybpokpqwhbmcxltckcpdegudaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211926.861207-436-258801694489723/AnsiballZ_systemd_service.py'
Feb 27 17:05:27 compute-0 sudo[171324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:27 compute-0 python3.9[171327]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:05:27 compute-0 systemd[1]: Reloading.
Feb 27 17:05:27 compute-0 systemd-rc-local-generator[171353]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:05:27 compute-0 systemd-sysv-generator[171357]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:05:27 compute-0 sudo[171324]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:28 compute-0 python3.9[171518]: ansible-ansible.builtin.service_facts Invoked
Feb 27 17:05:28 compute-0 network[171535]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 17:05:28 compute-0 network[171536]: 'network-scripts' will be removed from distribution in near future.
Feb 27 17:05:28 compute-0 network[171537]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 17:05:33 compute-0 sudo[171808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vusgiitvjhjnfinzxhsvnrcinwbzkplj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211932.9498756-455-141863813008495/AnsiballZ_systemd_service.py'
Feb 27 17:05:33 compute-0 sudo[171808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:33 compute-0 python3.9[171811]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:33 compute-0 sudo[171808]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:34 compute-0 sudo[171962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwslxqxoycjkjazjfdpskzufwmczuqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211933.8323772-455-64558866404369/AnsiballZ_systemd_service.py'
Feb 27 17:05:34 compute-0 sudo[171962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:34 compute-0 python3.9[171965]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:34 compute-0 sudo[171962]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:34 compute-0 sudo[172116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uutngmzrcssfzgsslggeymekhmbcxhmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211934.5532184-455-35526703702927/AnsiballZ_systemd_service.py'
Feb 27 17:05:34 compute-0 sudo[172116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:35 compute-0 python3.9[172119]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:35 compute-0 sudo[172116]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:35 compute-0 sudo[172270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saxbhtonsdpdprgjopddwskcapreydvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211935.3535275-455-113724253428148/AnsiballZ_systemd_service.py'
Feb 27 17:05:35 compute-0 sudo[172270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:36 compute-0 python3.9[172273]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:36 compute-0 sudo[172270]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:36 compute-0 sudo[172424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylxswymqdfipdkyruyesoabjzihslani ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211936.22997-455-117260672961326/AnsiballZ_systemd_service.py'
Feb 27 17:05:36 compute-0 sudo[172424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:36 compute-0 python3.9[172427]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:36 compute-0 sudo[172424]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:37 compute-0 sudo[172578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiordmzcomvaojnojnlulkhcgqmowpsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211937.1069458-455-173100282500637/AnsiballZ_systemd_service.py'
Feb 27 17:05:37 compute-0 sudo[172578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:37 compute-0 python3.9[172581]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:37 compute-0 sudo[172578]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:38 compute-0 sudo[172732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evrqvpgvjaqvtyookekbuszvdxdqwnut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211937.9958694-455-101930651614327/AnsiballZ_systemd_service.py'
Feb 27 17:05:38 compute-0 sudo[172732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:38 compute-0 python3.9[172735]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:38 compute-0 sudo[172732]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:39 compute-0 sudo[172886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okyzfsuxnkguhngswqnrrkftwdhmyeqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211938.8349059-455-86516826158728/AnsiballZ_systemd_service.py'
Feb 27 17:05:39 compute-0 sudo[172886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:39 compute-0 python3.9[172889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:05:39 compute-0 sudo[172886]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:40 compute-0 sudo[173040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikryjrmlzoahugytrkavxsspbowyklz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211939.7883425-514-91596068096780/AnsiballZ_file.py'
Feb 27 17:05:40 compute-0 sudo[173040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:40 compute-0 python3.9[173043]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:40 compute-0 sudo[173040]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:40 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 27 17:05:40 compute-0 sudo[173194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syxouhrhgeyqarlkbiaabowkaltbyorl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211940.4985723-514-100384176459618/AnsiballZ_file.py'
Feb 27 17:05:40 compute-0 sudo[173194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:41 compute-0 python3.9[173197]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:41 compute-0 sudo[173194]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:41 compute-0 sudo[173347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umuztrwcpcryvvbbzilwjabjoxlqhlnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211941.2376456-514-133457028761840/AnsiballZ_file.py'
Feb 27 17:05:41 compute-0 sudo[173347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:41 compute-0 python3.9[173350]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:41 compute-0 sudo[173347]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:41 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 27 17:05:42 compute-0 sudo[173501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egoguosgtnknorsliwpgkwxzfgumwvmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211941.888049-514-113891011614765/AnsiballZ_file.py'
Feb 27 17:05:42 compute-0 sudo[173501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:42 compute-0 python3.9[173504]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:42 compute-0 sudo[173501]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:42 compute-0 sudo[173654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzpmcsmzlyitensjtmxridfdpuvouypq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211942.5770774-514-54377282289986/AnsiballZ_file.py'
Feb 27 17:05:42 compute-0 sudo[173654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:43 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Feb 27 17:05:43 compute-0 python3.9[173657]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:43 compute-0 sudo[173654]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:43 compute-0 sudo[173808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywcrjzpiqnaqwtbfindzgdbrcpniszje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211943.310245-514-41612327259471/AnsiballZ_file.py'
Feb 27 17:05:43 compute-0 sudo[173808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:43 compute-0 python3.9[173811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:43 compute-0 sudo[173808]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:44 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 27 17:05:44 compute-0 sudo[173962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpmdmdfcrazpkbtjxhdmsefblhcidsfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211944.0387368-514-94817436188793/AnsiballZ_file.py'
Feb 27 17:05:44 compute-0 sudo[173962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:44 compute-0 python3.9[173965]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:44 compute-0 sudo[173962]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:45 compute-0 sudo[174115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apfrukkxjrcasadtjxvksfqnnmkzemdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211944.7386556-514-249105551581742/AnsiballZ_file.py'
Feb 27 17:05:45 compute-0 sudo[174115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:45 compute-0 python3.9[174118]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:45 compute-0 sudo[174115]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:45 compute-0 sudo[174268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juzxvolpqmjjpmonuhoqufktipvanvxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211945.481502-571-73061358570657/AnsiballZ_file.py'
Feb 27 17:05:45 compute-0 sudo[174268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:46 compute-0 python3.9[174271]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:46 compute-0 sudo[174268]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:46 compute-0 sudo[174421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eskzthfdcrqtxznpvgtdxkluvgieqdrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211946.2655442-571-98458743617715/AnsiballZ_file.py'
Feb 27 17:05:46 compute-0 sudo[174421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:46 compute-0 python3.9[174424]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:46 compute-0 sudo[174421]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:05:47.077 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:05:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:05:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:05:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:05:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:05:47 compute-0 sudo[174574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgdoicolajwujefwhcenrbotxxhkxngs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211946.9641051-571-192963497419906/AnsiballZ_file.py'
Feb 27 17:05:47 compute-0 sudo[174574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:47 compute-0 python3.9[174577]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:47 compute-0 sudo[174574]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:48 compute-0 sudo[174727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvhrdaiztksrcmxpsexjpvjjrfaaczyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211947.8334486-571-33876327756057/AnsiballZ_file.py'
Feb 27 17:05:48 compute-0 sudo[174727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:48 compute-0 python3.9[174730]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:48 compute-0 sudo[174727]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:48 compute-0 sudo[174880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sphmfboemhwfkxpelktlhuettrxfranc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211948.4987302-571-212472388522645/AnsiballZ_file.py'
Feb 27 17:05:48 compute-0 sudo[174880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:49 compute-0 python3.9[174883]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:49 compute-0 sudo[174880]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:50 compute-0 sudo[175033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpecscbpgdinlwomvrmygqgcrypvvbsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211949.933764-571-279246362403078/AnsiballZ_file.py'
Feb 27 17:05:50 compute-0 sudo[175033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:50 compute-0 python3.9[175036]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:50 compute-0 sudo[175033]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:50 compute-0 sudo[175186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgajqedgiwwohqvkcjmirsxmnjubhpvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211950.6214669-571-147369333136120/AnsiballZ_file.py'
Feb 27 17:05:50 compute-0 sudo[175186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:51 compute-0 python3.9[175189]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:51 compute-0 sudo[175186]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:51 compute-0 podman[175190]: 2026-02-27 17:05:51.384588674 +0000 UTC m=+0.081720972 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:05:51 compute-0 sudo[175357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszyipxnyofzdepopkunczhysoizjbuk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211951.4110124-571-170144865317490/AnsiballZ_file.py'
Feb 27 17:05:51 compute-0 sudo[175357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:51 compute-0 python3.9[175360]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:05:51 compute-0 sudo[175357]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:52 compute-0 sudo[175510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxzuzaljdezraynoluldoqntvdmgziqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211952.1620746-629-215142952273210/AnsiballZ_command.py'
Feb 27 17:05:52 compute-0 sudo[175510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:52 compute-0 python3.9[175513]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:52 compute-0 sudo[175510]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:53 compute-0 python3.9[175665]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 17:05:54 compute-0 sudo[175828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njtiooorfofjslsyubqygfhdnzzehbnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211953.9267285-647-2460778780595/AnsiballZ_systemd_service.py'
Feb 27 17:05:54 compute-0 sudo[175828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:54 compute-0 podman[175789]: 2026-02-27 17:05:54.366917931 +0000 UTC m=+0.117218630 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 27 17:05:54 compute-0 python3.9[175839]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:05:54 compute-0 systemd[1]: Reloading.
Feb 27 17:05:54 compute-0 systemd-rc-local-generator[175872]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:05:54 compute-0 systemd-sysv-generator[175876]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:05:54 compute-0 sudo[175828]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:55 compute-0 sudo[176037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afaeqewhzwizvtcckoieyjsfwodungbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211955.100454-655-84592276003705/AnsiballZ_command.py'
Feb 27 17:05:55 compute-0 sudo[176037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:55 compute-0 python3.9[176040]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:55 compute-0 sudo[176037]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:56 compute-0 sudo[176191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsvjyxwbzmwrzkmuhtqvhvlcofgoxmxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211955.76769-655-241890507006848/AnsiballZ_command.py'
Feb 27 17:05:56 compute-0 sudo[176191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:56 compute-0 python3.9[176194]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:56 compute-0 sudo[176191]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:56 compute-0 sudo[176345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmehxgqpbwmevywjegjyddyntovqbqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211956.3799837-655-42073436112146/AnsiballZ_command.py'
Feb 27 17:05:56 compute-0 sudo[176345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:56 compute-0 python3.9[176348]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:56 compute-0 sudo[176345]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:57 compute-0 sudo[176499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgnusdfjaywqffpdvmwhnfebzbychmfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211957.0191357-655-92662850349687/AnsiballZ_command.py'
Feb 27 17:05:57 compute-0 sudo[176499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:57 compute-0 python3.9[176502]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:57 compute-0 sudo[176499]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:58 compute-0 sudo[176653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhwebsrtrxjypxjuakdmiswebprwlqfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211957.734005-655-281159260866497/AnsiballZ_command.py'
Feb 27 17:05:58 compute-0 sudo[176653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:58 compute-0 python3.9[176656]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:58 compute-0 sudo[176653]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:58 compute-0 sudo[176807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhhdikqispikezchldsvfwmaaicwvdqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211958.467735-655-50255604169520/AnsiballZ_command.py'
Feb 27 17:05:58 compute-0 sudo[176807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:58 compute-0 python3.9[176810]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:59 compute-0 sudo[176807]: pam_unix(sudo:session): session closed for user root
Feb 27 17:05:59 compute-0 sudo[176961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euuibdrkqouetywmsziibmivvmgltfvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211959.1431186-655-245370227845989/AnsiballZ_command.py'
Feb 27 17:05:59 compute-0 sudo[176961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:05:59 compute-0 python3.9[176964]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:05:59 compute-0 sudo[176961]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:00 compute-0 sudo[177115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yelqwkpyqvagoiuqouutchnozbxfdrxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211959.848585-655-107626066392340/AnsiballZ_command.py'
Feb 27 17:06:00 compute-0 sudo[177115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:00 compute-0 python3.9[177118]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:06:00 compute-0 sudo[177115]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:01 compute-0 sudo[177269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvsswuounrdzywozcovjroldywptvplc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211961.351141-734-198177217777433/AnsiballZ_file.py'
Feb 27 17:06:01 compute-0 sudo[177269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:01 compute-0 python3.9[177272]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:01 compute-0 sudo[177269]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:02 compute-0 sudo[177422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azouuzzpbegxdetguodtqelbvdwxzwmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211962.0918627-734-206005410300938/AnsiballZ_file.py'
Feb 27 17:06:02 compute-0 sudo[177422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:02 compute-0 python3.9[177425]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:02 compute-0 sudo[177422]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:03 compute-0 sudo[177575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bakvpdgbqabxkbynjavpqbnmxgkreeop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211962.743307-749-198094351010800/AnsiballZ_file.py'
Feb 27 17:06:03 compute-0 sudo[177575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:03 compute-0 python3.9[177578]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:03 compute-0 sudo[177575]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:03 compute-0 sudo[177728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buxetzozecvezxkruhrzxzimyvwexson ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211963.3440456-749-90764794470890/AnsiballZ_file.py'
Feb 27 17:06:03 compute-0 sudo[177728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:03 compute-0 python3.9[177731]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:03 compute-0 sudo[177728]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:04 compute-0 sudo[177881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avoreaglgakeelbmevisjjshkoiicrpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211963.9445562-749-127097101103153/AnsiballZ_file.py'
Feb 27 17:06:04 compute-0 sudo[177881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:04 compute-0 python3.9[177884]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:04 compute-0 sudo[177881]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:04 compute-0 sudo[178034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oufqvzthdtfryhcjzfhuzrrxmzwqzieo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211964.6678758-749-30627403848549/AnsiballZ_file.py'
Feb 27 17:06:04 compute-0 sudo[178034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:05 compute-0 python3.9[178037]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:05 compute-0 sudo[178034]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:05 compute-0 sudo[178187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yunhenttraguwnimadhvuhxocrecncwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211965.304991-749-239736244817029/AnsiballZ_file.py'
Feb 27 17:06:05 compute-0 sudo[178187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:05 compute-0 python3.9[178190]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:05 compute-0 sudo[178187]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:06 compute-0 sudo[178340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsiajmsejqxhabhggranganguvjeycuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211966.096175-749-138372564699326/AnsiballZ_file.py'
Feb 27 17:06:06 compute-0 sudo[178340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:06 compute-0 python3.9[178343]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:06 compute-0 sudo[178340]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:07 compute-0 sudo[178493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sadlhooilcqhublhcqqzvhdztkxkgnjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211966.8383548-749-92456103583642/AnsiballZ_file.py'
Feb 27 17:06:07 compute-0 sudo[178493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:07 compute-0 python3.9[178496]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:07 compute-0 sudo[178493]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:13 compute-0 sudo[178646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eujqgrwyxoaxfiyyhfqieznywutannfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211973.0625138-958-269578121363156/AnsiballZ_getent.py'
Feb 27 17:06:13 compute-0 sudo[178646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:13 compute-0 python3.9[178649]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 27 17:06:13 compute-0 sudo[178646]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:14 compute-0 sudo[178800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfjthzkcfguuyfsrqaygpjhucxhlokjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211973.930402-966-196576015422213/AnsiballZ_group.py'
Feb 27 17:06:14 compute-0 sudo[178800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:14 compute-0 python3.9[178803]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 17:06:14 compute-0 groupadd[178804]: group added to /etc/group: name=nova, GID=42436
Feb 27 17:06:14 compute-0 groupadd[178804]: group added to /etc/gshadow: name=nova
Feb 27 17:06:14 compute-0 groupadd[178804]: new group: name=nova, GID=42436
Feb 27 17:06:14 compute-0 sudo[178800]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:15 compute-0 sudo[178959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njflypxxzocyrqmfpfadxkeppqqanzvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211974.8367383-974-136569594084876/AnsiballZ_user.py'
Feb 27 17:06:15 compute-0 sudo[178959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:15 compute-0 python3.9[178962]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 27 17:06:15 compute-0 useradd[178964]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 27 17:06:15 compute-0 useradd[178964]: add 'nova' to group 'libvirt'
Feb 27 17:06:15 compute-0 useradd[178964]: add 'nova' to shadow group 'libvirt'
Feb 27 17:06:15 compute-0 sudo[178959]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:16 compute-0 sshd-session[178995]: Accepted publickey for zuul from 192.168.122.30 port 44190 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:06:16 compute-0 systemd-logind[803]: New session 24 of user zuul.
Feb 27 17:06:16 compute-0 systemd[1]: Started Session 24 of User zuul.
Feb 27 17:06:16 compute-0 sshd-session[178995]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:06:16 compute-0 sshd-session[178998]: Received disconnect from 192.168.122.30 port 44190:11: disconnected by user
Feb 27 17:06:16 compute-0 sshd-session[178998]: Disconnected from user zuul 192.168.122.30 port 44190
Feb 27 17:06:16 compute-0 sshd-session[178995]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:06:16 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Feb 27 17:06:16 compute-0 systemd-logind[803]: Session 24 logged out. Waiting for processes to exit.
Feb 27 17:06:16 compute-0 systemd-logind[803]: Removed session 24.
Feb 27 17:06:17 compute-0 python3.9[179148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:18 compute-0 python3.9[179224]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:18 compute-0 python3.9[179374]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:19 compute-0 python3.9[179495]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211978.3478267-999-220788899245366/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:20 compute-0 python3.9[179645]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:20 compute-0 python3.9[179766]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211979.6778588-999-45460217871315/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:21 compute-0 python3.9[179916]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:21 compute-0 podman[179964]: 2026-02-27 17:06:21.705053925 +0000 UTC m=+0.095462229 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
Feb 27 17:06:21 compute-0 python3.9[180055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211980.9357965-999-271479686162968/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:22 compute-0 python3.9[180205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:23 compute-0 python3.9[180326]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772211982.189525-1053-277489480696729/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:23 compute-0 sudo[180476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vubusbfweevfdajkaqruftrvibxqemlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211983.489324-1068-137549347486171/AnsiballZ_file.py'
Feb 27 17:06:23 compute-0 sudo[180476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:24 compute-0 python3.9[180479]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:24 compute-0 sudo[180476]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:24 compute-0 sudo[180644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgmtkddnkrkofceqzhkknwxwhqjkyzri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211984.335203-1076-89346850534883/AnsiballZ_copy.py'
Feb 27 17:06:24 compute-0 sudo[180644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:24 compute-0 podman[180603]: 2026-02-27 17:06:24.680858516 +0000 UTC m=+0.090733473 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:06:24 compute-0 python3.9[180650]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:24 compute-0 sudo[180644]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:25 compute-0 sudo[180807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptbvricrqlmsuwajefdarqlehadjsirq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211985.0758014-1084-249200071099220/AnsiballZ_stat.py'
Feb 27 17:06:25 compute-0 sudo[180807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:25 compute-0 python3.9[180810]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:25 compute-0 sudo[180807]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:26 compute-0 sudo[180960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acrjrxlslqbdbctvlanoyknpwxcjjqer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211985.8082387-1092-66848277340732/AnsiballZ_stat.py'
Feb 27 17:06:26 compute-0 sudo[180960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:26 compute-0 python3.9[180963]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:26 compute-0 sudo[180960]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:26 compute-0 sudo[181084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjpuggqdplplgiphxbvxhcrxrihpnapm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211985.8082387-1092-66848277340732/AnsiballZ_copy.py'
Feb 27 17:06:26 compute-0 sudo[181084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:27 compute-0 python3.9[181087]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1772211985.8082387-1092-66848277340732/.source _original_basename=.t0_6uc61 follow=False checksum=6c2caf8dd75b402a4089eb555e03cff33f7a3a3e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Feb 27 17:06:27 compute-0 sudo[181084]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:27 compute-0 python3.9[181239]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:28 compute-0 sudo[181391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsghyjxzdxkaatjdzvwzzpjhtbxyxpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211988.3743355-1120-125137484885991/AnsiballZ_file.py'
Feb 27 17:06:28 compute-0 sudo[181391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:28 compute-0 python3.9[181394]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:28 compute-0 sudo[181391]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:29 compute-0 sudo[181544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzztqjpoczxnpwwsnobdhzoczlqoksok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211989.0383253-1128-267307122675556/AnsiballZ_file.py'
Feb 27 17:06:29 compute-0 sudo[181544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:29 compute-0 python3.9[181547]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:29 compute-0 sudo[181544]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:30 compute-0 python3.9[181697]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:32 compute-0 sudo[182118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etwzfwrcghcwktdjwnzosotxsfgxfwwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211992.0654597-1162-132986614745236/AnsiballZ_container_config_data.py'
Feb 27 17:06:32 compute-0 sudo[182118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:32 compute-0 python3.9[182121]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 27 17:06:32 compute-0 sudo[182118]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:33 compute-0 sudo[182271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynlaretlzkjrkrhwnpccyyqzyvvfmkkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211993.223663-1173-80315065267292/AnsiballZ_container_config_hash.py'
Feb 27 17:06:33 compute-0 sudo[182271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:33 compute-0 python3.9[182274]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:06:33 compute-0 sudo[182271]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:34 compute-0 sudo[182424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlomibupzlwvqdbythuigchyngliitrq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772211994.300285-1183-147240417545262/AnsiballZ_edpm_container_manage.py'
Feb 27 17:06:34 compute-0 sudo[182424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:35 compute-0 python3[182427]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:06:35 compute-0 podman[182466]: 2026-02-27 17:06:35.370787901 +0000 UTC m=+0.085181818 container create f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute_init, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:06:35 compute-0 podman[182466]: 2026-02-27 17:06:35.309170079 +0000 UTC m=+0.023564046 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 27 17:06:35 compute-0 python3[182427]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498 --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 27 17:06:35 compute-0 sudo[182424]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:36 compute-0 sudo[182654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqkgexqitktihncizwrflxixxlpkvlke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211995.7355154-1191-273436043988468/AnsiballZ_stat.py'
Feb 27 17:06:36 compute-0 sudo[182654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:36 compute-0 python3.9[182657]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:36 compute-0 sudo[182654]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:37 compute-0 python3.9[182809]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:06:38 compute-0 sudo[182959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gscdkczmkpsrupghploushuuvawnuhss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211997.9848468-1218-175195522500432/AnsiballZ_stat.py'
Feb 27 17:06:38 compute-0 sudo[182959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:38 compute-0 python3.9[182962]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:38 compute-0 sudo[182959]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:38 compute-0 sudo[183085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdmqdjfufcxszsrcplzmhlnkwkxynxej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211997.9848468-1218-175195522500432/AnsiballZ_copy.py'
Feb 27 17:06:38 compute-0 sudo[183085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:39 compute-0 python3.9[183088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772211997.9848468-1218-175195522500432/.source.yaml _original_basename=.azjmmefy follow=False checksum=7ccff526751bca89bcdba7d2482756f31bce0861 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:39 compute-0 sudo[183085]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:39 compute-0 sudo[183238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixwkbptcviyilwrsigaduolvepaljbgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772211999.5277581-1235-177518349600110/AnsiballZ_file.py'
Feb 27 17:06:39 compute-0 sudo[183238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:40 compute-0 python3.9[183241]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:40 compute-0 sudo[183238]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:40 compute-0 sudo[183391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ladzeuomhejcjwjxhugrbzwakalceaon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212000.227363-1243-234582900750261/AnsiballZ_file.py'
Feb 27 17:06:40 compute-0 sudo[183391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:40 compute-0 python3.9[183394]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:06:40 compute-0 sudo[183391]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:41 compute-0 sudo[183544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxtpvmqxgsugcakazapkcamjqvdwvhjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212000.9285824-1251-64371048307617/AnsiballZ_stat.py'
Feb 27 17:06:41 compute-0 sudo[183544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:41 compute-0 python3.9[183547]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:41 compute-0 sudo[183544]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:41 compute-0 sudo[183668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlskgrvckzfifetmjnsioqnjzzrfvxbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212000.9285824-1251-64371048307617/AnsiballZ_copy.py'
Feb 27 17:06:41 compute-0 sudo[183668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:42 compute-0 python3.9[183671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212000.9285824-1251-64371048307617/.source.json _original_basename=.pt7bcdik follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:42 compute-0 sudo[183668]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:42 compute-0 python3.9[183821]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:44 compute-0 sudo[184242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtisnmmooqzudrosmyjptmymspmskysf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212004.4443264-1291-227663385698638/AnsiballZ_container_config_data.py'
Feb 27 17:06:44 compute-0 sudo[184242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:44 compute-0 python3.9[184245]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 27 17:06:44 compute-0 sudo[184242]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:45 compute-0 sudo[184395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cotokbbkxcebuvvssgokezubllpdonvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212005.3561761-1302-57628806682515/AnsiballZ_container_config_hash.py'
Feb 27 17:06:45 compute-0 sudo[184395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:45 compute-0 python3.9[184398]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:06:45 compute-0 sudo[184395]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:46 compute-0 sudo[184548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtefdkpjltwuepoxvgxsqsvvnamlbdcy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212006.285006-1312-94992031289543/AnsiballZ_edpm_container_manage.py'
Feb 27 17:06:46 compute-0 sudo[184548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:46 compute-0 python3[184551]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:06:47 compute-0 podman[184588]: 2026-02-27 17:06:47.037015433 +0000 UTC m=+0.048462823 container create b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=nova_compute)
Feb 27 17:06:47 compute-0 podman[184588]: 2026-02-27 17:06:47.014659748 +0000 UTC m=+0.026107178 image pull 7e637240710437807d86f704ec92f4417e40d6b1f76088848cab504c91655fe5 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 27 17:06:47 compute-0 python3[184551]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498 --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 27 17:06:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:06:47.078 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:06:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:06:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:06:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:06:47.079 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:06:47 compute-0 sudo[184548]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:47 compute-0 sudo[184777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obsgpbyrwzppwalujrvtogzqyipikazk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212007.4166172-1320-82629564098586/AnsiballZ_stat.py'
Feb 27 17:06:47 compute-0 sudo[184777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:47 compute-0 python3.9[184780]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:47 compute-0 sudo[184777]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:48 compute-0 sudo[184932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjflcfvsmgefxyeiwmzmznicbsbrdqbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212008.269648-1329-185888293668004/AnsiballZ_file.py'
Feb 27 17:06:48 compute-0 sudo[184932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:48 compute-0 python3.9[184935]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:48 compute-0 sudo[184932]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:49 compute-0 sudo[185009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsddzsrhuyxtdvqtfoewbotfdofuohef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212008.269648-1329-185888293668004/AnsiballZ_stat.py'
Feb 27 17:06:49 compute-0 sudo[185009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:49 compute-0 python3.9[185012]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:49 compute-0 sudo[185009]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:50 compute-0 sudo[185161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzxkmagbwemeyeyicyrqlyuqkuyqkigo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212009.8736339-1329-24966956236024/AnsiballZ_copy.py'
Feb 27 17:06:50 compute-0 sudo[185161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:50 compute-0 python3.9[185164]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772212009.8736339-1329-24966956236024/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:50 compute-0 sudo[185161]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:50 compute-0 sudo[185238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmiyoxokkvjnndksjfqvemfidmpazprz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212009.8736339-1329-24966956236024/AnsiballZ_systemd.py'
Feb 27 17:06:50 compute-0 sudo[185238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:51 compute-0 python3.9[185241]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:06:51 compute-0 systemd[1]: Reloading.
Feb 27 17:06:51 compute-0 systemd-sysv-generator[185269]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:06:51 compute-0 systemd-rc-local-generator[185264]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:06:51 compute-0 sudo[185238]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:51 compute-0 sudo[185357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gizwupylsobdzormvdxoxxoswnftnzqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212009.8736339-1329-24966956236024/AnsiballZ_systemd.py'
Feb 27 17:06:51 compute-0 sudo[185357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:51 compute-0 python3.9[185360]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:06:52 compute-0 systemd[1]: Reloading.
Feb 27 17:06:52 compute-0 podman[185362]: 2026-02-27 17:06:52.074725233 +0000 UTC m=+0.079437648 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 27 17:06:52 compute-0 systemd-sysv-generator[185409]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:06:52 compute-0 systemd-rc-local-generator[185403]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:06:52 compute-0 systemd[1]: Starting nova_compute container...
Feb 27 17:06:52 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:06:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 27 17:06:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 27 17:06:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 27 17:06:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 27 17:06:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 27 17:06:52 compute-0 podman[185426]: 2026-02-27 17:06:52.451195853 +0000 UTC m=+0.148815500 container init b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 17:06:52 compute-0 podman[185426]: 2026-02-27 17:06:52.45970347 +0000 UTC m=+0.157323107 container start b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute)
Feb 27 17:06:52 compute-0 nova_compute[185441]: + sudo -E kolla_set_configs
Feb 27 17:06:52 compute-0 podman[185426]: nova_compute
Feb 27 17:06:52 compute-0 systemd[1]: Started nova_compute container.
Feb 27 17:06:52 compute-0 sudo[185357]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Validating config file
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying service configuration files
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Deleting /etc/ceph
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Creating directory /etc/ceph
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /etc/ceph
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Writing out command to execute
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:06:52 compute-0 nova_compute[185441]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 27 17:06:52 compute-0 nova_compute[185441]: ++ cat /run_command
Feb 27 17:06:52 compute-0 nova_compute[185441]: + CMD=nova-compute
Feb 27 17:06:52 compute-0 nova_compute[185441]: + ARGS=
Feb 27 17:06:52 compute-0 nova_compute[185441]: + sudo kolla_copy_cacerts
Feb 27 17:06:52 compute-0 nova_compute[185441]: + [[ ! -n '' ]]
Feb 27 17:06:52 compute-0 nova_compute[185441]: + . kolla_extend_start
Feb 27 17:06:52 compute-0 nova_compute[185441]: Running command: 'nova-compute'
Feb 27 17:06:52 compute-0 nova_compute[185441]: + echo 'Running command: '\''nova-compute'\'''
Feb 27 17:06:52 compute-0 nova_compute[185441]: + umask 0022
Feb 27 17:06:52 compute-0 nova_compute[185441]: + exec nova-compute
Feb 27 17:06:53 compute-0 python3.9[185602]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:06:54 compute-0 sudo[185753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmgokuzlbrivlxjgfqhscbtpcnewnyqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212013.7989035-1374-250965238126173/AnsiballZ_stat.py'
Feb 27 17:06:54 compute-0 sudo[185753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:54 compute-0 python3.9[185756]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.319 185445 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.320 185445 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.320 185445 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.320 185445 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 27 17:06:54 compute-0 sudo[185753]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.446 185445 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.457 185445 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:06:54 compute-0 nova_compute[185441]: 2026-02-27 17:06:54.457 185445 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 27 17:06:54 compute-0 sudo[185883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abtgdzdtnvoktvnyhpjnupmvjlibrbzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212013.7989035-1374-250965238126173/AnsiballZ_copy.py'
Feb 27 17:06:54 compute-0 sudo[185883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:54 compute-0 podman[185885]: 2026-02-27 17:06:54.825080138 +0000 UTC m=+0.093016010 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:06:54 compute-0 python3.9[185887]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212013.7989035-1374-250965238126173/.source.yaml _original_basename=._ygg4vpa follow=False checksum=a8ad15aa4407292ada72618c4c05b9188ce2e3f5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:06:54 compute-0 sudo[185883]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.076 185445 INFO nova.virt.driver [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.210 185445 INFO nova.compute.provider_config [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.226 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.226 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.226 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.227 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.227 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.227 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.227 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.227 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.228 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.229 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.230 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.231 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.232 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.233 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.233 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.233 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.233 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.233 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.234 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.235 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.235 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.235 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.235 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.235 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.236 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.237 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.237 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.237 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.238 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.238 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.238 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.238 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.238 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.239 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.240 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.241 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.242 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.243 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.244 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.245 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.246 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.247 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.248 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.249 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.250 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.251 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.252 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.253 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.254 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.255 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.256 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.257 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.258 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.259 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.260 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.261 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.262 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.263 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.264 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.264 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.264 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.264 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.264 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.265 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.265 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.265 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.265 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.265 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.266 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.266 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.266 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.266 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.266 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.267 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.267 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.267 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.267 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.267 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.268 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.268 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.268 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.268 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.268 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.269 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.270 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.270 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.270 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.270 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.270 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.271 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.271 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.271 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.271 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.271 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.272 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.272 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.272 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.272 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.272 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.273 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.273 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.273 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.273 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.273 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.274 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.275 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.275 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.275 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.275 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.275 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.276 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.277 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.277 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.277 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.277 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.277 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.278 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.279 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.280 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.281 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.282 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.283 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.284 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.285 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.286 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.287 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.288 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.289 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.290 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.291 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.292 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.293 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.294 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.295 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.296 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.297 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.298 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.299 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.300 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.301 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 WARNING oslo_config.cfg [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 27 17:06:55 compute-0 nova_compute[185441]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 27 17:06:55 compute-0 nova_compute[185441]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 27 17:06:55 compute-0 nova_compute[185441]: and ``live_migration_inbound_addr`` respectively.
Feb 27 17:06:55 compute-0 nova_compute[185441]: ).  Its value may be silently ignored in the future.
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.302 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.303 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.304 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.305 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.306 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.307 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.308 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.309 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.310 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.311 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.312 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.313 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.314 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.315 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.316 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.317 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.318 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.319 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.320 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.321 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.321 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.321 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.321 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.322 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.323 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.323 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.323 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.323 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.323 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.324 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.324 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.324 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.324 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.325 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.326 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.327 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.328 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.328 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.328 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.328 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.328 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.329 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.329 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.329 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.329 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.329 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.330 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.331 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.332 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.332 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.332 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.332 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.333 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.333 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.333 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.333 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.333 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.334 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.335 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.335 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.335 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.335 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.336 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.336 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.336 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.336 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.336 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.337 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.337 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.337 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.337 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.337 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.338 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.339 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.339 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.339 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.339 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.339 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.340 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.340 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.340 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.340 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.340 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.341 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.341 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.341 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.341 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.341 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.342 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.342 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.342 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.342 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.343 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.344 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.344 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.344 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.344 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.344 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.345 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.345 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.345 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.345 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.346 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.346 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.346 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.346 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.346 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.347 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.347 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.347 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.347 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.347 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.348 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.349 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.349 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.349 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.350 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.350 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.350 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.351 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.351 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.351 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.352 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.352 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.352 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.353 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.353 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.353 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.354 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.355 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.356 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.357 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.358 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.359 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.360 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.361 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.362 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.363 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.364 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.365 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.366 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.367 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.368 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.369 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.370 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.371 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.372 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.373 185445 DEBUG oslo_service.service [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.374 185445 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.391 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.391 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.392 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.392 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 27 17:06:55 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Feb 27 17:06:55 compute-0 systemd[1]: Started libvirt QEMU daemon.
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.458 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f16c2c1ac40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.461 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f16c2c1ac40> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.462 185445 INFO nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Connection event '1' reason 'None'
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.491 185445 WARNING nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 27 17:06:55 compute-0 nova_compute[185441]: 2026-02-27 17:06:55.491 185445 DEBUG nova.virt.libvirt.volume.mount [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 27 17:06:55 compute-0 python3.9[186115]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.363 185445 INFO nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Libvirt host capabilities <capabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]: 
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <host>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <uuid>1b296e36-37ac-4d9b-9bcd-e79d835197c3</uuid>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <arch>x86_64</arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model>EPYC-Rome-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <vendor>AMD</vendor>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <microcode version='16777317'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <signature family='23' model='49' stepping='0'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='x2apic'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='tsc-deadline'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='osxsave'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='hypervisor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='tsc_adjust'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='spec-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='stibp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='arch-capabilities'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='cmp_legacy'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='topoext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='virt-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='lbrv'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='tsc-scale'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='vmcb-clean'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='pause-filter'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='pfthreshold'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='svme-addr-chk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='rdctl-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='skip-l1dfl-vmentry'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='mds-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature name='pschange-mc-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <pages unit='KiB' size='4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <pages unit='KiB' size='2048'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <pages unit='KiB' size='1048576'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <power_management>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <suspend_mem/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <suspend_disk/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <suspend_hybrid/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </power_management>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <iommu support='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <migration_features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <live/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <uri_transports>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <uri_transport>tcp</uri_transport>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <uri_transport>rdma</uri_transport>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </uri_transports>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </migration_features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <topology>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <cells num='1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <cell id='0'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <memory unit='KiB'>7864276</memory>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <pages unit='KiB' size='4'>1966069</pages>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <pages unit='KiB' size='2048'>0</pages>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <distances>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <sibling id='0' value='10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           </distances>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           <cpus num='8'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:           </cpus>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         </cell>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </cells>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </topology>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <cache>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </cache>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <secmodel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model>selinux</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <doi>0</doi>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </secmodel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <secmodel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model>dac</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <doi>0</doi>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </secmodel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </host>
Feb 27 17:06:56 compute-0 nova_compute[185441]: 
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <guest>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <os_type>hvm</os_type>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <arch name='i686'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <wordsize>32</wordsize>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <domain type='qemu'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <domain type='kvm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <pae/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <nonpae/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <acpi default='on' toggle='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <apic default='on' toggle='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <cpuselection/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <deviceboot/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <disksnapshot default='on' toggle='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <externalSnapshot/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </guest>
Feb 27 17:06:56 compute-0 nova_compute[185441]: 
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <guest>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <os_type>hvm</os_type>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <arch name='x86_64'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <wordsize>64</wordsize>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <domain type='qemu'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <domain type='kvm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <acpi default='on' toggle='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <apic default='on' toggle='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <cpuselection/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <deviceboot/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <disksnapshot default='on' toggle='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <externalSnapshot/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </guest>
Feb 27 17:06:56 compute-0 nova_compute[185441]: 
Feb 27 17:06:56 compute-0 nova_compute[185441]: </capabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]: 
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.372 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.392 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 27 17:06:56 compute-0 nova_compute[185441]: <domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <domain>kvm</domain>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <arch>i686</arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <vcpu max='4096'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <iothreads supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <os supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='firmware'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <loader supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>rom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pflash</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='readonly'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>yes</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='secure'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </loader>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </os>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='maximum' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='maximumMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-model' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <vendor>AMD</vendor>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='x2apic'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='stibp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='succor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lbrv'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='custom' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Dhyana-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <memoryBacking supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='sourceType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>anonymous</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>memfd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </memoryBacking>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <disk supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='diskDevice'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>disk</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cdrom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>floppy</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>lun</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>fdc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>sata</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </disk>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <graphics supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vnc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egl-headless</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </graphics>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <video supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='modelType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vga</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cirrus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>none</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>bochs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ramfb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </video>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hostdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='mode'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>subsystem</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='startupPolicy'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>mandatory</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>requisite</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>optional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='subsysType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pci</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='capsType'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='pciBackend'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hostdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <rng supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>random</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </rng>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <filesystem supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='driverType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>path</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>handle</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtiofs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </filesystem>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tpm supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-tis</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-crb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emulator</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>external</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendVersion'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>2.0</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </tpm>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <redirdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </redirdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <channel supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </channel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <crypto supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </crypto>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <interface supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>passt</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </interface>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <panic supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>isa</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>hyperv</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </panic>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <console supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>null</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dev</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pipe</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stdio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>udp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tcp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu-vdagent</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </console>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <gic supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <vmcoreinfo supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <genid supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backingStoreInput supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backup supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <async-teardown supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <s390-pv supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <ps2 supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tdx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sev supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sgx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hyperv supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='features'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>relaxed</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vapic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>spinlocks</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vpindex</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>runtime</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>synic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stimer</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reset</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vendor_id</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>frequencies</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reenlightenment</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tlbflush</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ipi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>avic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emsr_bitmap</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>xmm_input</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <spinlocks>4095</spinlocks>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <stimer_direct>on</stimer_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hyperv>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <launchSecurity supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]: </domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.399 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 27 17:06:56 compute-0 nova_compute[185441]: <domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <domain>kvm</domain>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <arch>i686</arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <vcpu max='240'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <iothreads supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <os supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='firmware'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <loader supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>rom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pflash</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='readonly'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>yes</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='secure'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </loader>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </os>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='maximum' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='maximumMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-model' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <vendor>AMD</vendor>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='x2apic'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='stibp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='succor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lbrv'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='custom' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Dhyana-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <memoryBacking supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='sourceType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>anonymous</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>memfd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </memoryBacking>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <disk supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='diskDevice'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>disk</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cdrom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>floppy</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>lun</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ide</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>fdc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>sata</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </disk>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <graphics supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vnc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egl-headless</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </graphics>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <video supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='modelType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vga</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cirrus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>none</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>bochs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ramfb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </video>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hostdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='mode'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>subsystem</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='startupPolicy'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>mandatory</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>requisite</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>optional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='subsysType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pci</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='capsType'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='pciBackend'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hostdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <rng supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>random</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </rng>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <filesystem supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='driverType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>path</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>handle</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtiofs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </filesystem>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tpm supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-tis</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-crb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emulator</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>external</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendVersion'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>2.0</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </tpm>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <redirdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </redirdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <channel supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </channel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <crypto supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </crypto>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <interface supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>passt</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </interface>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <panic supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>isa</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>hyperv</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </panic>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <console supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>null</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dev</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pipe</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stdio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>udp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tcp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu-vdagent</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </console>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <gic supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <vmcoreinfo supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <genid supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backingStoreInput supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backup supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <async-teardown supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <s390-pv supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <ps2 supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tdx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sev supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sgx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hyperv supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='features'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>relaxed</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vapic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>spinlocks</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vpindex</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>runtime</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>synic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stimer</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reset</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vendor_id</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>frequencies</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reenlightenment</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tlbflush</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ipi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>avic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emsr_bitmap</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>xmm_input</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <spinlocks>4095</spinlocks>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <stimer_direct>on</stimer_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hyperv>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <launchSecurity supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]: </domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.462 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.467 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 27 17:06:56 compute-0 nova_compute[185441]: <domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <domain>kvm</domain>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <arch>x86_64</arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <vcpu max='4096'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <iothreads supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <os supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='firmware'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>efi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <loader supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>rom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pflash</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='readonly'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>yes</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='secure'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>yes</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </loader>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </os>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='maximum' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='maximumMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-model' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <vendor>AMD</vendor>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='x2apic'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='stibp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='succor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lbrv'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='custom' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Dhyana-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <memoryBacking supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='sourceType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>anonymous</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>memfd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </memoryBacking>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <disk supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='diskDevice'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>disk</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cdrom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>floppy</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>lun</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>fdc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>sata</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </disk>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <graphics supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vnc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egl-headless</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </graphics>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <video supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='modelType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vga</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cirrus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>none</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>bochs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ramfb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </video>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hostdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='mode'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>subsystem</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='startupPolicy'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>mandatory</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>requisite</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>optional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='subsysType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pci</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='capsType'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='pciBackend'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hostdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <rng supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>random</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </rng>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <filesystem supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='driverType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>path</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>handle</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtiofs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </filesystem>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tpm supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-tis</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-crb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emulator</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>external</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendVersion'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>2.0</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </tpm>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <redirdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </redirdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <channel supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </channel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <crypto supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </crypto>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <interface supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>passt</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </interface>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <panic supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>isa</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>hyperv</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </panic>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <console supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>null</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dev</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pipe</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stdio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>udp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tcp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu-vdagent</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </console>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <gic supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <vmcoreinfo supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <genid supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backingStoreInput supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backup supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <async-teardown supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <s390-pv supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <ps2 supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tdx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sev supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sgx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hyperv supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='features'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>relaxed</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vapic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>spinlocks</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vpindex</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>runtime</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>synic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stimer</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reset</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vendor_id</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>frequencies</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reenlightenment</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tlbflush</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ipi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>avic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emsr_bitmap</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>xmm_input</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <spinlocks>4095</spinlocks>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <stimer_direct>on</stimer_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hyperv>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <launchSecurity supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]: </domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.569 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 27 17:06:56 compute-0 nova_compute[185441]: <domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <domain>kvm</domain>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <arch>x86_64</arch>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <vcpu max='240'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <iothreads supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <os supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='firmware'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <loader supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>rom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pflash</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='readonly'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>yes</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='secure'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>no</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </loader>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </os>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='maximum' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='maximumMigratable'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>on</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>off</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='host-model' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <vendor>AMD</vendor>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='x2apic'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='stibp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='succor'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lbrv'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <mode name='custom' supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Broadwell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ddpd-u'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sha512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm3'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sm4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Cooperlake-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Denverton-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Dhyana-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amd-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='auto-ibrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibpb-brtype'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='no-nested-data-bp'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='null-sel-clr-base'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='perfmon-v2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbpb'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='stibp-always-on'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='EPYC-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-128'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-256'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx10-512'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='prefetchiti'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Haswell-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='IvyBridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='KnightsMill-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4fmaps'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-4vnniw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512er'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512pf'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fma4'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tbm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xop'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='amx-tile'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-bf16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-fp16'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bitalg'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vbmi2'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrc'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fzrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='la57'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='taa-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='tsx-ldtrk'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='SierraForest-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ifma'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-ne-convert'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx-vnni-int8'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bhi-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='bus-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cmpccxadd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fbsdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='fsrs'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ibrs-all'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='intel-psfd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ipred-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='lam'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mcdt-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pbrsb-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='psdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rrsba-ctrl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='serialize'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vaes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='vpclmulqdq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='hle'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='rtm'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512bw'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512cd'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512dq'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512f'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='avx512vl'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='invpcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pcid'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='pku'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='mpx'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v2'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v3'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='core-capability'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='split-lock-detect'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='Snowridge-v4'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='cldemote'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='erms'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='gfni'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdir64b'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='movdiri'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='xsaves'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='athlon-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='core2duo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='coreduo-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='n270-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='ss'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <blockers model='phenom-v1'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnow'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <feature name='3dnowext'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </blockers>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </mode>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </cpu>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <memoryBacking supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <enum name='sourceType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>anonymous</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <value>memfd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </memoryBacking>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <disk supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='diskDevice'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>disk</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cdrom</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>floppy</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>lun</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ide</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>fdc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>sata</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </disk>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <graphics supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vnc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egl-headless</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </graphics>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <video supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='modelType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vga</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>cirrus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>none</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>bochs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ramfb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </video>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hostdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='mode'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>subsystem</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='startupPolicy'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>mandatory</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>requisite</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>optional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='subsysType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pci</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>scsi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='capsType'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='pciBackend'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hostdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <rng supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtio-non-transitional</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>random</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>egd</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </rng>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <filesystem supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='driverType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>path</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>handle</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>virtiofs</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </filesystem>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tpm supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-tis</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tpm-crb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emulator</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>external</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendVersion'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>2.0</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </tpm>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <redirdev supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='bus'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>usb</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </redirdev>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <channel supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </channel>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <crypto supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendModel'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>builtin</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </crypto>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <interface supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='backendType'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>default</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>passt</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </interface>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <panic supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='model'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>isa</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>hyperv</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </panic>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <console supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='type'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>null</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vc</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pty</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dev</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>file</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>pipe</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stdio</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>udp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tcp</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>unix</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>qemu-vdagent</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>dbus</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </console>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </devices>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   <features>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <gic supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <vmcoreinfo supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <genid supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backingStoreInput supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <backup supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <async-teardown supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <s390-pv supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <ps2 supported='yes'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <tdx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sev supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <sgx supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <hyperv supported='yes'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <enum name='features'>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>relaxed</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vapic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>spinlocks</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vpindex</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>runtime</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>synic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>stimer</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reset</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>vendor_id</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>frequencies</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>reenlightenment</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>tlbflush</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>ipi</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>avic</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>emsr_bitmap</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <value>xmm_input</value>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </enum>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       <defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <spinlocks>4095</spinlocks>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <stimer_direct>on</stimer_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:06:56 compute-0 nova_compute[185441]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:06:56 compute-0 nova_compute[185441]:       </defaults>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     </hyperv>
Feb 27 17:06:56 compute-0 nova_compute[185441]:     <launchSecurity supported='no'/>
Feb 27 17:06:56 compute-0 nova_compute[185441]:   </features>
Feb 27 17:06:56 compute-0 nova_compute[185441]: </domainCapabilities>
Feb 27 17:06:56 compute-0 nova_compute[185441]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.638 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.639 185445 INFO nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Secure Boot support detected
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.641 185445 INFO nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.641 185445 INFO nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.652 185445 DEBUG nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.711 185445 INFO nova.virt.node [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Determined node identity 2b4df47a-58ba-41db-b94b-eb594c2f9699 from /var/lib/nova/compute_id
Feb 27 17:06:56 compute-0 python3.9[186277]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.740 185445 WARNING nova.compute.manager [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Compute nodes ['2b4df47a-58ba-41db-b94b-eb594c2f9699'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.795 185445 INFO nova.compute.manager [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.842 185445 WARNING nova.compute.manager [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.843 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.843 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.843 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:06:56 compute-0 nova_compute[185441]: 2026-02-27 17:06:56.843 185445 DEBUG nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:06:56 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Feb 27 17:06:56 compute-0 systemd[1]: Started libvirt nodedev daemon.
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.166 185445 WARNING nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.167 185445 DEBUG nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6171MB free_disk=73.42586135864258GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.167 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.167 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.206 185445 WARNING nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] No compute node record for compute-0.ctlplane.example.com:2b4df47a-58ba-41db-b94b-eb594c2f9699: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 2b4df47a-58ba-41db-b94b-eb594c2f9699 could not be found.
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.258 185445 INFO nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 2b4df47a-58ba-41db-b94b-eb594c2f9699
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.379 185445 DEBUG nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:06:57 compute-0 nova_compute[185441]: 2026-02-27 17:06:57.379 185445 DEBUG nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:06:57 compute-0 python3.9[186450]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.281 185445 INFO nova.scheduler.client.report [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] [req-9e23a11f-9ab8-4e2e-b099-9517b4070178] Created resource provider record via placement API for resource provider with UUID 2b4df47a-58ba-41db-b94b-eb594c2f9699 and name compute-0.ctlplane.example.com.
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.651 185445 DEBUG nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 27 17:06:58 compute-0 nova_compute[185441]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.651 185445 INFO nova.virt.libvirt.host [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] kernel doesn't support AMD SEV
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.652 185445 DEBUG nova.compute.provider_tree [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.653 185445 DEBUG nova.virt.libvirt.driver [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:06:58 compute-0 sudo[186600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owrhnssfmncyxfziecslakaaedackasi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212018.1047168-1424-55791328146520/AnsiballZ_podman_container.py'
Feb 27 17:06:58 compute-0 sudo[186600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.721 185445 DEBUG nova.scheduler.client.report [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Updated inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.723 185445 DEBUG nova.compute.provider_tree [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Updating resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.723 185445 DEBUG nova.compute.provider_tree [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.867 185445 DEBUG nova.compute.provider_tree [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Updating resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.892 185445 DEBUG nova.compute.resource_tracker [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.893 185445 DEBUG oslo_concurrency.lockutils [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:06:58 compute-0 nova_compute[185441]: 2026-02-27 17:06:58.893 185445 DEBUG nova.service [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 27 17:06:58 compute-0 python3.9[186603]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 27 17:06:59 compute-0 nova_compute[185441]: 2026-02-27 17:06:59.047 185445 DEBUG nova.service [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 27 17:06:59 compute-0 nova_compute[185441]: 2026-02-27 17:06:59.048 185445 DEBUG nova.servicegroup.drivers.db [None req-4f576899-b5d4-4ad5-bd5a-7404ce11086b - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 27 17:06:59 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:06:59 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:06:59 compute-0 sudo[186600]: pam_unix(sudo:session): session closed for user root
Feb 27 17:06:59 compute-0 sudo[186775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtungorxpxmtpabqujkaxdzllgawfmyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212019.2861803-1432-4131308151779/AnsiballZ_systemd.py'
Feb 27 17:06:59 compute-0 sudo[186775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:06:59 compute-0 python3.9[186778]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 27 17:06:59 compute-0 systemd[1]: Stopping nova_compute container...
Feb 27 17:07:00 compute-0 nova_compute[185441]: 2026-02-27 17:07:00.094 185445 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 27 17:07:00 compute-0 nova_compute[185441]: 2026-02-27 17:07:00.097 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:07:00 compute-0 nova_compute[185441]: 2026-02-27 17:07:00.098 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:07:00 compute-0 nova_compute[185441]: 2026-02-27 17:07:00.098 185445 DEBUG oslo_concurrency.lockutils [None req-ba1000b0-94c5-42a3-bd6f-0f85f37f1eee - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:07:00 compute-0 virtqemud[186011]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 27 17:07:00 compute-0 virtqemud[186011]: hostname: compute-0
Feb 27 17:07:00 compute-0 virtqemud[186011]: End of file while reading data: Input/output error
Feb 27 17:07:00 compute-0 systemd[1]: libpod-b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92.scope: Deactivated successfully.
Feb 27 17:07:00 compute-0 systemd[1]: libpod-b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92.scope: Consumed 3.074s CPU time.
Feb 27 17:07:00 compute-0 conmon[185441]: conmon b3e826e66e8361bcedd1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92.scope/container/memory.events
Feb 27 17:07:00 compute-0 podman[186782]: 2026-02-27 17:07:00.48749463 +0000 UTC m=+0.474357207 container died b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:07:00 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92-userdata-shm.mount: Deactivated successfully.
Feb 27 17:07:00 compute-0 systemd[1]: var-lib-containers-storage-overlay-36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec-merged.mount: Deactivated successfully.
Feb 27 17:07:00 compute-0 podman[186782]: 2026-02-27 17:07:00.539833326 +0000 UTC m=+0.526695883 container cleanup b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 27 17:07:00 compute-0 podman[186782]: nova_compute
Feb 27 17:07:00 compute-0 podman[186811]: nova_compute
Feb 27 17:07:00 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 27 17:07:00 compute-0 systemd[1]: Stopped nova_compute container.
Feb 27 17:07:00 compute-0 systemd[1]: Starting nova_compute container...
Feb 27 17:07:00 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:07:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/36deddf796226c3f491e0f739c2af942ace77532a5d53da7baca9a13f68997ec/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:00 compute-0 podman[186824]: 2026-02-27 17:07:00.741051493 +0000 UTC m=+0.106653162 container init b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 27 17:07:00 compute-0 podman[186824]: 2026-02-27 17:07:00.752790889 +0000 UTC m=+0.118392488 container start b3e826e66e8361bcedd1ed1be07c8b5377156cfdce04f4d4f04579c1c037ec92 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855-9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 27 17:07:00 compute-0 podman[186824]: nova_compute
Feb 27 17:07:00 compute-0 nova_compute[186840]: + sudo -E kolla_set_configs
Feb 27 17:07:00 compute-0 systemd[1]: Started nova_compute container.
Feb 27 17:07:00 compute-0 sudo[186775]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Validating config file
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying service configuration files
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /etc/ceph
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Creating directory /etc/ceph
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /etc/ceph
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Writing out command to execute
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:07:00 compute-0 nova_compute[186840]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 27 17:07:00 compute-0 nova_compute[186840]: ++ cat /run_command
Feb 27 17:07:00 compute-0 nova_compute[186840]: + CMD=nova-compute
Feb 27 17:07:00 compute-0 nova_compute[186840]: + ARGS=
Feb 27 17:07:00 compute-0 nova_compute[186840]: + sudo kolla_copy_cacerts
Feb 27 17:07:00 compute-0 nova_compute[186840]: + [[ ! -n '' ]]
Feb 27 17:07:00 compute-0 nova_compute[186840]: + . kolla_extend_start
Feb 27 17:07:00 compute-0 nova_compute[186840]: Running command: 'nova-compute'
Feb 27 17:07:00 compute-0 nova_compute[186840]: + echo 'Running command: '\''nova-compute'\'''
Feb 27 17:07:00 compute-0 nova_compute[186840]: + umask 0022
Feb 27 17:07:00 compute-0 nova_compute[186840]: + exec nova-compute
Feb 27 17:07:01 compute-0 sudo[187001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyoahasehxopupvujclhdhxhoaxwkove ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212021.0901704-1441-64915316600436/AnsiballZ_podman_container.py'
Feb 27 17:07:01 compute-0 sudo[187001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:01 compute-0 python3.9[187004]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 27 17:07:01 compute-0 systemd[1]: Started libpod-conmon-f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157.scope.
Feb 27 17:07:01 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:07:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8882e2c9d54eec1a07a924eed72e3d58c99cb811260a57b965aa18380e20ce93/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8882e2c9d54eec1a07a924eed72e3d58c99cb811260a57b965aa18380e20ce93/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8882e2c9d54eec1a07a924eed72e3d58c99cb811260a57b965aa18380e20ce93/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 27 17:07:01 compute-0 podman[187029]: 2026-02-27 17:07:01.977020921 +0000 UTC m=+0.174372823 container init f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2)
Feb 27 17:07:01 compute-0 podman[187029]: 2026-02-27 17:07:01.985216141 +0000 UTC m=+0.182568023 container start f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 27 17:07:01 compute-0 python3.9[187004]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Applying nova statedir ownership
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 27 17:07:02 compute-0 nova_compute_init[187051]: INFO:nova_statedir:Nova statedir ownership complete
Feb 27 17:07:02 compute-0 systemd[1]: libpod-f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157.scope: Deactivated successfully.
Feb 27 17:07:02 compute-0 podman[187065]: 2026-02-27 17:07:02.090325944 +0000 UTC m=+0.024720484 container died f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=nova_compute_init, container_name=nova_compute_init)
Feb 27 17:07:02 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157-userdata-shm.mount: Deactivated successfully.
Feb 27 17:07:02 compute-0 systemd[1]: var-lib-containers-storage-overlay-8882e2c9d54eec1a07a924eed72e3d58c99cb811260a57b965aa18380e20ce93-merged.mount: Deactivated successfully.
Feb 27 17:07:02 compute-0 podman[187065]: 2026-02-27 17:07:02.122594111 +0000 UTC m=+0.056988631 container cleanup f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9a1869fba1d1fbcb85d890dece7267715285c88adb54e4746132cd3effc20498'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 27 17:07:02 compute-0 systemd[1]: libpod-conmon-f648cc0ff9f14ed6dce27979949913ea534bf6e5420fb2b5dbb5b689dfada157.scope: Deactivated successfully.
Feb 27 17:07:02 compute-0 sudo[187001]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.552 186844 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.553 186844 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.553 186844 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.553 186844 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 27 17:07:02 compute-0 sshd-session[161793]: Connection closed by 192.168.122.30 port 56574
Feb 27 17:07:02 compute-0 sshd-session[161790]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:07:02 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Feb 27 17:07:02 compute-0 systemd[1]: session-23.scope: Consumed 1min 44.142s CPU time.
Feb 27 17:07:02 compute-0 systemd-logind[803]: Session 23 logged out. Waiting for processes to exit.
Feb 27 17:07:02 compute-0 systemd-logind[803]: Removed session 23.
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.667 186844 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.690 186844 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:07:02 compute-0 nova_compute[186840]: 2026-02-27 17:07:02.690 186844 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.176 186844 INFO nova.virt.driver [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.306 186844 INFO nova.compute.provider_config [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.319 186844 DEBUG oslo_concurrency.lockutils [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.320 186844 DEBUG oslo_concurrency.lockutils [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.320 186844 DEBUG oslo_concurrency.lockutils [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.320 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.320 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.321 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.322 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.323 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.324 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.325 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.326 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.326 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.326 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.326 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.326 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.327 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.328 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.329 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.330 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.331 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.332 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.333 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.334 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.334 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.334 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.334 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.334 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.335 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.336 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.337 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.338 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.339 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.340 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.341 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.342 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.343 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.344 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.345 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.346 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.347 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.348 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.349 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.350 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.351 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.352 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.352 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.352 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.352 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.352 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.353 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.354 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.355 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.356 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.357 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.358 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.359 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.360 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.361 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.362 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.363 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.364 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.365 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.366 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.367 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.368 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.369 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.370 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.371 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.372 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.373 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.373 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.373 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.373 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.373 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.374 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.375 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.376 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.377 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.378 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.379 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.380 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.381 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.382 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.383 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.384 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.385 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.386 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.387 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.388 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.388 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.388 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.388 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.388 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.389 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.390 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.391 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.392 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.393 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.393 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.393 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.393 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.393 186844 WARNING oslo_config.cfg [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 27 17:07:03 compute-0 nova_compute[186840]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 27 17:07:03 compute-0 nova_compute[186840]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 27 17:07:03 compute-0 nova_compute[186840]: and ``live_migration_inbound_addr`` respectively.
Feb 27 17:07:03 compute-0 nova_compute[186840]: ).  Its value may be silently ignored in the future.
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.394 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.395 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.396 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.397 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.398 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.399 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.400 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.401 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.402 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.403 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.404 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.405 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.406 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.407 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.408 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.409 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.410 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.411 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.412 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.413 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.414 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.415 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.416 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.417 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.418 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.419 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.420 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.421 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.422 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.422 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.422 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.422 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.422 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.423 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.423 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.423 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.423 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.423 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.424 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.425 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.426 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.427 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.428 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.428 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.428 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.428 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.428 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.429 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.429 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.429 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.429 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.429 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.430 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.430 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.430 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.430 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.430 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.431 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.431 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.431 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.431 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.432 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.432 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.432 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.432 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.433 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.433 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.433 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.433 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.434 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.434 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.434 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.434 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.435 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.436 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.436 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.436 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.436 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.436 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.437 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.437 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.437 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.437 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.437 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.438 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.438 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.438 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.438 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.439 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.439 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.439 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.440 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.440 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.440 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.440 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.441 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.441 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.441 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.441 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.442 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.442 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.442 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.442 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.442 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.443 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.443 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.443 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.443 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.443 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.444 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.444 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.444 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.444 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.444 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.445 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.445 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.445 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.445 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.445 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.446 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.446 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.446 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.446 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.446 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.447 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.447 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.447 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.447 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.447 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.448 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.449 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.449 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.449 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.449 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.450 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.450 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.450 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.450 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.450 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.451 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.452 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.452 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.452 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.452 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.452 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.453 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.453 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.453 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.453 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.454 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.455 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.455 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.455 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.455 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.455 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.456 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.456 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.456 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.456 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.456 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.457 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.458 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.458 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.458 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.458 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.459 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.459 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.459 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.459 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.459 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.460 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.460 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.460 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.460 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.461 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.461 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.461 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.461 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.462 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.462 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.462 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.462 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.463 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.463 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.463 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.463 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.464 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.464 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.464 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.464 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.465 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.466 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.466 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.466 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.466 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.466 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.467 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.468 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.468 186844 DEBUG oslo_service.service [None req-dcf48a4a-0965-4bc1-b260-ba247e631954 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.469 186844 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260220085704.5cfeecb.el9)
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.490 186844 INFO nova.virt.node [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Determined node identity 2b4df47a-58ba-41db-b94b-eb594c2f9699 from /var/lib/nova/compute_id
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.491 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.492 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.492 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.493 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.507 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f9bad529400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.510 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f9bad529400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.511 186844 INFO nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Connection event '1' reason 'None'
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.518 186844 INFO nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Libvirt host capabilities <capabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]: 
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <host>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <uuid>1b296e36-37ac-4d9b-9bcd-e79d835197c3</uuid>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <arch>x86_64</arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model>EPYC-Rome-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <vendor>AMD</vendor>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <microcode version='16777317'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <signature family='23' model='49' stepping='0'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='x2apic'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='tsc-deadline'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='osxsave'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='hypervisor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='tsc_adjust'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='spec-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='stibp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='arch-capabilities'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='cmp_legacy'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='topoext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='virt-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='lbrv'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='tsc-scale'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='vmcb-clean'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='pause-filter'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='pfthreshold'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='svme-addr-chk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='rdctl-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='skip-l1dfl-vmentry'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='mds-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature name='pschange-mc-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <pages unit='KiB' size='4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <pages unit='KiB' size='2048'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <pages unit='KiB' size='1048576'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <power_management>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <suspend_mem/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <suspend_disk/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <suspend_hybrid/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </power_management>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <iommu support='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <migration_features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <live/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <uri_transports>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <uri_transport>tcp</uri_transport>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <uri_transport>rdma</uri_transport>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </uri_transports>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </migration_features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <topology>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <cells num='1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <cell id='0'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <memory unit='KiB'>7864276</memory>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <pages unit='KiB' size='4'>1966069</pages>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <pages unit='KiB' size='2048'>0</pages>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <distances>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <sibling id='0' value='10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           </distances>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           <cpus num='8'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:           </cpus>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         </cell>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </cells>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </topology>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <cache>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </cache>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <secmodel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model>selinux</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <doi>0</doi>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </secmodel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <secmodel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model>dac</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <doi>0</doi>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </secmodel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </host>
Feb 27 17:07:03 compute-0 nova_compute[186840]: 
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <guest>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <os_type>hvm</os_type>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <arch name='i686'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <wordsize>32</wordsize>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <domain type='qemu'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <domain type='kvm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <pae/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <nonpae/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <acpi default='on' toggle='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <apic default='on' toggle='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <cpuselection/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <deviceboot/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <disksnapshot default='on' toggle='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <externalSnapshot/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </guest>
Feb 27 17:07:03 compute-0 nova_compute[186840]: 
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <guest>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <os_type>hvm</os_type>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <arch name='x86_64'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <wordsize>64</wordsize>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <domain type='qemu'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <domain type='kvm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <acpi default='on' toggle='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <apic default='on' toggle='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <cpuselection/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <deviceboot/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <disksnapshot default='on' toggle='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <externalSnapshot/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </guest>
Feb 27 17:07:03 compute-0 nova_compute[186840]: 
Feb 27 17:07:03 compute-0 nova_compute[186840]: </capabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]: 
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.524 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.527 186844 DEBUG nova.virt.libvirt.volume.mount [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.530 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 27 17:07:03 compute-0 nova_compute[186840]: <domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <domain>kvm</domain>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <arch>i686</arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <vcpu max='4096'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <iothreads supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <os supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='firmware'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <loader supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>rom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pflash</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='readonly'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>yes</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='secure'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </loader>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </os>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='maximum' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='maximumMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-model' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <vendor>AMD</vendor>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='x2apic'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='stibp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='succor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lbrv'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='custom' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Dhyana-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <memoryBacking supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='sourceType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>anonymous</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>memfd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </memoryBacking>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <disk supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='diskDevice'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>disk</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cdrom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>floppy</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>lun</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>fdc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>sata</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <graphics supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vnc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egl-headless</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <video supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='modelType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vga</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cirrus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>none</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>bochs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ramfb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </video>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hostdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='mode'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>subsystem</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='startupPolicy'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>mandatory</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>requisite</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>optional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='subsysType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pci</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='capsType'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='pciBackend'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hostdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <rng supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>random</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <filesystem supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='driverType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>path</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>handle</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtiofs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </filesystem>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tpm supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-tis</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-crb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emulator</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>external</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendVersion'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>2.0</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </tpm>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <redirdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </redirdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <channel supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </channel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <crypto supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </crypto>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <interface supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>passt</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <panic supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>isa</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>hyperv</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </panic>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <console supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>null</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dev</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pipe</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stdio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>udp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tcp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu-vdagent</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </console>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <gic supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <vmcoreinfo supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <genid supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backingStoreInput supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backup supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <async-teardown supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <s390-pv supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <ps2 supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tdx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sev supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sgx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hyperv supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='features'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>relaxed</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vapic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>spinlocks</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vpindex</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>runtime</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>synic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stimer</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reset</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vendor_id</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>frequencies</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reenlightenment</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tlbflush</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ipi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>avic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emsr_bitmap</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>xmm_input</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <spinlocks>4095</spinlocks>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <stimer_direct>on</stimer_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hyperv>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <launchSecurity supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]: </domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.541 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 27 17:07:03 compute-0 nova_compute[186840]: <domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <domain>kvm</domain>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <arch>i686</arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <vcpu max='240'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <iothreads supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <os supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='firmware'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <loader supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>rom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pflash</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='readonly'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>yes</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='secure'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </loader>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </os>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='maximum' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='maximumMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-model' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <vendor>AMD</vendor>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='x2apic'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='stibp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='succor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lbrv'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='custom' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Dhyana-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <memoryBacking supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='sourceType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>anonymous</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>memfd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </memoryBacking>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <disk supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='diskDevice'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>disk</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cdrom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>floppy</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>lun</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ide</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>fdc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>sata</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <graphics supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vnc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egl-headless</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <video supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='modelType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vga</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cirrus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>none</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>bochs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ramfb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </video>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hostdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='mode'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>subsystem</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='startupPolicy'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>mandatory</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>requisite</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>optional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='subsysType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pci</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='capsType'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='pciBackend'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hostdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <rng supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>random</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <filesystem supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='driverType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>path</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>handle</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtiofs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </filesystem>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tpm supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-tis</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-crb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emulator</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>external</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendVersion'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>2.0</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </tpm>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <redirdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </redirdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <channel supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </channel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <crypto supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </crypto>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <interface supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>passt</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <panic supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>isa</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>hyperv</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </panic>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <console supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>null</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dev</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pipe</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stdio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>udp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tcp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu-vdagent</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </console>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <gic supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <vmcoreinfo supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <genid supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backingStoreInput supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backup supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <async-teardown supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <s390-pv supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <ps2 supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tdx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sev supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sgx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hyperv supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='features'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>relaxed</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vapic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>spinlocks</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vpindex</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>runtime</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>synic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stimer</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reset</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vendor_id</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>frequencies</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reenlightenment</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tlbflush</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ipi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>avic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emsr_bitmap</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>xmm_input</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <spinlocks>4095</spinlocks>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <stimer_direct>on</stimer_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hyperv>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <launchSecurity supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]: </domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.599 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.605 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 27 17:07:03 compute-0 nova_compute[186840]: <domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <domain>kvm</domain>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <arch>x86_64</arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <vcpu max='4096'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <iothreads supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <os supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='firmware'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>efi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <loader supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>rom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pflash</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='readonly'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>yes</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='secure'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>yes</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </loader>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </os>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='maximum' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='maximumMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-model' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <vendor>AMD</vendor>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='x2apic'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='stibp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='succor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lbrv'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='custom' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Dhyana-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <memoryBacking supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='sourceType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>anonymous</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>memfd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </memoryBacking>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <disk supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='diskDevice'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>disk</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cdrom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>floppy</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>lun</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>fdc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>sata</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <graphics supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vnc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egl-headless</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <video supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='modelType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vga</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cirrus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>none</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>bochs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ramfb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </video>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hostdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='mode'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>subsystem</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='startupPolicy'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>mandatory</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>requisite</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>optional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='subsysType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pci</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='capsType'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='pciBackend'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hostdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <rng supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>random</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <filesystem supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='driverType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>path</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>handle</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtiofs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </filesystem>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tpm supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-tis</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-crb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emulator</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>external</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendVersion'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>2.0</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </tpm>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <redirdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </redirdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <channel supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </channel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <crypto supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </crypto>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <interface supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>passt</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <panic supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>isa</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>hyperv</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </panic>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <console supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>null</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dev</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pipe</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stdio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>udp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tcp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu-vdagent</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </console>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <gic supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <vmcoreinfo supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <genid supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backingStoreInput supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backup supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <async-teardown supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <s390-pv supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <ps2 supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tdx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sev supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sgx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hyperv supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='features'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>relaxed</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vapic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>spinlocks</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vpindex</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>runtime</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>synic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stimer</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reset</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vendor_id</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>frequencies</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reenlightenment</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tlbflush</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ipi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>avic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emsr_bitmap</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>xmm_input</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <spinlocks>4095</spinlocks>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <stimer_direct>on</stimer_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hyperv>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <launchSecurity supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]: </domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.676 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 27 17:07:03 compute-0 nova_compute[186840]: <domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <path>/usr/libexec/qemu-kvm</path>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <domain>kvm</domain>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <arch>x86_64</arch>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <vcpu max='240'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <iothreads supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <os supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='firmware'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <loader supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>rom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pflash</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='readonly'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>yes</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='secure'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>no</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </loader>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </os>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-passthrough' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='hostPassthroughMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='maximum' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='maximumMigratable'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>on</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>off</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='host-model' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <vendor>AMD</vendor>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='x2apic'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-deadline'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='hypervisor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc_adjust'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='spec-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='stibp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='cmp_legacy'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='overflow-recov'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='succor'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='amd-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='virt-ssbd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lbrv'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='tsc-scale'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='vmcb-clean'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='flushbyasid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pause-filter'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='pfthreshold'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='svme-addr-chk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <feature policy='disable' name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <mode name='custom' supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Broadwell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cascadelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='ClearwaterForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ddpd-u'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sha512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm3'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sm4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Cooperlake-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Denverton-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Dhyana-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Genoa-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Milan-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Rome-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-Turin-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amd-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='auto-ibrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vp2intersect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fs-gs-base-ns'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibpb-brtype'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='no-nested-data-bp'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='null-sel-clr-base'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='perfmon-v2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbpb'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='srso-user-kernel-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='stibp-always-on'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='EPYC-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='GraniteRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-128'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-256'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx10-512'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='prefetchiti'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Haswell-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-noTSX'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v6'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Icelake-Server-v7'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='IvyBridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='KnightsMill-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4fmaps'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-4vnniw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512er'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512pf'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G4-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Opteron_G5-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fma4'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tbm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xop'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SapphireRapids-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='amx-tile'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-bf16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-fp16'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512-vpopcntdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bitalg'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vbmi2'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrc'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fzrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='la57'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='taa-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='tsx-ldtrk'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='SierraForest-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ifma'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-ne-convert'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx-vnni-int8'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bhi-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='bus-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cmpccxadd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fbsdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='fsrs'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ibrs-all'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='intel-psfd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ipred-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='lam'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mcdt-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pbrsb-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='psdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rrsba-ctrl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='sbdr-ssdp-no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='serialize'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vaes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='vpclmulqdq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Client-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='hle'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='rtm'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Skylake-Server-v5'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512bw'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512cd'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512dq'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512f'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='avx512vl'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='invpcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pcid'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='pku'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='mpx'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v2'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v3'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='core-capability'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='split-lock-detect'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='Snowridge-v4'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='cldemote'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='erms'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='gfni'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdir64b'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='movdiri'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='xsaves'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='athlon-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='core2duo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='coreduo-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='n270-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='ss'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <blockers model='phenom-v1'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnow'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <feature name='3dnowext'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </blockers>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </mode>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <memoryBacking supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <enum name='sourceType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>anonymous</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <value>memfd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </memoryBacking>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <disk supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='diskDevice'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>disk</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cdrom</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>floppy</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>lun</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ide</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>fdc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>sata</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <graphics supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vnc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egl-headless</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <video supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='modelType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vga</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>cirrus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>none</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>bochs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ramfb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </video>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hostdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='mode'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>subsystem</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='startupPolicy'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>mandatory</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>requisite</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>optional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='subsysType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pci</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>scsi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='capsType'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='pciBackend'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hostdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <rng supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtio-non-transitional</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>random</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>egd</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <filesystem supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='driverType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>path</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>handle</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>virtiofs</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </filesystem>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tpm supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-tis</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tpm-crb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emulator</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>external</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendVersion'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>2.0</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </tpm>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <redirdev supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='bus'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>usb</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </redirdev>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <channel supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </channel>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <crypto supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendModel'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>builtin</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </crypto>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <interface supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='backendType'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>default</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>passt</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <panic supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='model'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>isa</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>hyperv</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </panic>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <console supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='type'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>null</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vc</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pty</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dev</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>file</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>pipe</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stdio</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>udp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tcp</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>unix</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>qemu-vdagent</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>dbus</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </console>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   <features>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <gic supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <vmcoreinfo supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <genid supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backingStoreInput supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <backup supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <async-teardown supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <s390-pv supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <ps2 supported='yes'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <tdx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sev supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <sgx supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <hyperv supported='yes'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <enum name='features'>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>relaxed</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vapic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>spinlocks</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vpindex</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>runtime</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>synic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>stimer</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reset</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>vendor_id</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>frequencies</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>reenlightenment</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>tlbflush</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>ipi</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>avic</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>emsr_bitmap</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <value>xmm_input</value>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </enum>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       <defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <spinlocks>4095</spinlocks>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <stimer_direct>on</stimer_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_direct>on</tlbflush_direct>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <tlbflush_extended>on</tlbflush_extended>
Feb 27 17:07:03 compute-0 nova_compute[186840]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 27 17:07:03 compute-0 nova_compute[186840]:       </defaults>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     </hyperv>
Feb 27 17:07:03 compute-0 nova_compute[186840]:     <launchSecurity supported='no'/>
Feb 27 17:07:03 compute-0 nova_compute[186840]:   </features>
Feb 27 17:07:03 compute-0 nova_compute[186840]: </domainCapabilities>
Feb 27 17:07:03 compute-0 nova_compute[186840]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.739 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.740 186844 INFO nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Secure Boot support detected
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.742 186844 INFO nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.742 186844 INFO nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.751 186844 DEBUG nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.775 186844 INFO nova.virt.node [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Determined node identity 2b4df47a-58ba-41db-b94b-eb594c2f9699 from /var/lib/nova/compute_id
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.794 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Verified node 2b4df47a-58ba-41db-b94b-eb594c2f9699 matches my host compute-0.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.819 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.913 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.913 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.913 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:07:03 compute-0 nova_compute[186840]: 2026-02-27 17:07:03.913 186844 DEBUG nova.compute.resource_tracker [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.050 186844 WARNING nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.051 186844 DEBUG nova.compute.resource_tracker [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6150MB free_disk=73.42438507080078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.051 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.051 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.243 186844 DEBUG nova.compute.resource_tracker [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.244 186844 DEBUG nova.compute.resource_tracker [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.335 186844 DEBUG nova.scheduler.client.report [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.363 186844 DEBUG nova.scheduler.client.report [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.363 186844 DEBUG nova.compute.provider_tree [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.401 186844 DEBUG nova.scheduler.client.report [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.423 186844 DEBUG nova.scheduler.client.report [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.440 186844 DEBUG nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 27 17:07:04 compute-0 nova_compute[186840]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.440 186844 INFO nova.virt.libvirt.host [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] kernel doesn't support AMD SEV
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.441 186844 DEBUG nova.compute.provider_tree [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.441 186844 DEBUG nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.463 186844 DEBUG nova.scheduler.client.report [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.482 186844 DEBUG nova.compute.resource_tracker [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.482 186844 DEBUG oslo_concurrency.lockutils [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.430s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.482 186844 DEBUG nova.service [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.518 186844 DEBUG nova.service [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 27 17:07:04 compute-0 nova_compute[186840]: 2026-02-27 17:07:04.518 186844 DEBUG nova.servicegroup.drivers.db [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 27 17:07:08 compute-0 sshd-session[187139]: Accepted publickey for zuul from 192.168.122.30 port 42900 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:07:08 compute-0 systemd-logind[803]: New session 25 of user zuul.
Feb 27 17:07:08 compute-0 systemd[1]: Started Session 25 of User zuul.
Feb 27 17:07:08 compute-0 sshd-session[187139]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:07:09 compute-0 python3.9[187292]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 27 17:07:10 compute-0 sudo[187446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eymfjpuwnjixkqbufouimsulqtefgouj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212029.902301-31-47639175918301/AnsiballZ_systemd_service.py'
Feb 27 17:07:10 compute-0 sudo[187446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:10 compute-0 python3.9[187449]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:07:10 compute-0 systemd[1]: Reloading.
Feb 27 17:07:10 compute-0 systemd-rc-local-generator[187478]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:07:10 compute-0 systemd-sysv-generator[187481]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:07:11 compute-0 sudo[187446]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:12 compute-0 python3.9[187642]: ansible-ansible.builtin.service_facts Invoked
Feb 27 17:07:12 compute-0 network[187659]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 27 17:07:12 compute-0 network[187660]: 'network-scripts' will be removed from distribution in near future.
Feb 27 17:07:12 compute-0 network[187661]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 27 17:07:17 compute-0 sudo[187932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcuvkcdsfigecpbbcsymxjyozfclnbbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212037.2957578-50-22632844537762/AnsiballZ_systemd_service.py'
Feb 27 17:07:17 compute-0 sudo[187932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:17 compute-0 python3.9[187935]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:07:17 compute-0 sudo[187932]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:18 compute-0 sudo[188086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyzscnalmfmrizkgenzrlvedjkgbasxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212038.2566745-60-132081580068213/AnsiballZ_file.py'
Feb 27 17:07:18 compute-0 sudo[188086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:18 compute-0 python3.9[188089]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:18 compute-0 sudo[188086]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:18 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:07:18 compute-0 rsyslogd[1012]: imjournal from <np0005633116:sudo>: begin to drop messages due to rate-limiting
Feb 27 17:07:18 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:07:19 compute-0 sudo[188240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgcfmjhpjncylrfupdgcezvahxvkjjvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212039.043303-68-84595730483926/AnsiballZ_file.py'
Feb 27 17:07:19 compute-0 sudo[188240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:19 compute-0 python3.9[188243]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:19 compute-0 sudo[188240]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:20 compute-0 sudo[188393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-umbasxomgqrlirjtdlwxqanzjubnypkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212039.780603-77-176829610156830/AnsiballZ_command.py'
Feb 27 17:07:20 compute-0 sudo[188393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:20 compute-0 python3.9[188396]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:07:20 compute-0 sudo[188393]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:21 compute-0 python3.9[188548]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 17:07:21 compute-0 sudo[188698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrkjqtbpqkzvspculsgfvngsxpsrmahr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212041.6497643-95-172964470026388/AnsiballZ_systemd_service.py'
Feb 27 17:07:21 compute-0 sudo[188698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:22 compute-0 python3.9[188701]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:07:22 compute-0 systemd[1]: Reloading.
Feb 27 17:07:22 compute-0 systemd-sysv-generator[188732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:07:22 compute-0 systemd-rc-local-generator[188726]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:07:22 compute-0 sudo[188698]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:22 compute-0 podman[188744]: 2026-02-27 17:07:22.642318974 +0000 UTC m=+0.077833578 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:07:23 compute-0 sudo[188912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfwnfcbhxwhlwcflvgvsqjrrcgoiavup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212042.8047483-103-150502713087892/AnsiballZ_command.py'
Feb 27 17:07:23 compute-0 sudo[188912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:23 compute-0 python3.9[188915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:07:23 compute-0 sudo[188912]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:23 compute-0 sudo[189066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmzkkuzeqqvaxnexazwuxqerppsburbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212043.5693226-112-54087067738216/AnsiballZ_file.py'
Feb 27 17:07:23 compute-0 sudo[189066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:24 compute-0 python3.9[189069]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:24 compute-0 sudo[189066]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:24 compute-0 python3.9[189219]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:25 compute-0 sudo[189384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyhusaqnivaigxlydtokyyktrfpyfldw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212045.0844479-128-68428516962778/AnsiballZ_group.py'
Feb 27 17:07:25 compute-0 sudo[189384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:25 compute-0 podman[189345]: 2026-02-27 17:07:25.576072517 +0000 UTC m=+0.085573508 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 27 17:07:25 compute-0 python3.9[189397]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 27 17:07:25 compute-0 sudo[189384]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:26 compute-0 sudo[189550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwgxsektrufatuczngjloldfzcqdwolt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212046.2998335-139-214359935675568/AnsiballZ_getent.py'
Feb 27 17:07:26 compute-0 sudo[189550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:26 compute-0 python3.9[189553]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 27 17:07:27 compute-0 sudo[189550]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:27 compute-0 sudo[189704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpwhmuqgiezjulpojyeisypnjzdhcegg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212047.2479627-147-90917504085556/AnsiballZ_group.py'
Feb 27 17:07:27 compute-0 sudo[189704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:27 compute-0 python3.9[189707]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 27 17:07:27 compute-0 groupadd[189708]: group added to /etc/group: name=ceilometer, GID=42405
Feb 27 17:07:27 compute-0 groupadd[189708]: group added to /etc/gshadow: name=ceilometer
Feb 27 17:07:27 compute-0 groupadd[189708]: new group: name=ceilometer, GID=42405
Feb 27 17:07:27 compute-0 sudo[189704]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:28 compute-0 sudo[189863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sulvwabfmtaobmwsgiapprggflbfejnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212048.2370794-155-104749047753687/AnsiballZ_user.py'
Feb 27 17:07:28 compute-0 sudo[189863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:28 compute-0 python3.9[189866]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 27 17:07:29 compute-0 useradd[189868]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 27 17:07:29 compute-0 useradd[189868]: add 'ceilometer' to group 'libvirt'
Feb 27 17:07:29 compute-0 useradd[189868]: add 'ceilometer' to shadow group 'libvirt'
Feb 27 17:07:30 compute-0 sudo[189863]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:31 compute-0 python3.9[190024]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:32 compute-0 python3.9[190145]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772212051.297054-181-244531522563049/.source.conf _original_basename=ceilometer.conf follow=False checksum=5c6a9288d15d1b05b1484826ce363ad306e9930c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:33 compute-0 python3.9[190295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:33 compute-0 python3.9[190416]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772212052.6856704-181-52783666968585/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:34 compute-0 python3.9[190566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:34 compute-0 python3.9[190687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1772212053.784977-181-222487101227594/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:35 compute-0 python3.9[190837]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:36 compute-0 python3.9[190989]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:36 compute-0 nova_compute[186840]: 2026-02-27 17:07:36.520 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:07:36 compute-0 nova_compute[186840]: 2026-02-27 17:07:36.548 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:07:36 compute-0 python3.9[191141]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:37 compute-0 python3.9[191262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212056.3285334-240-101228658073343/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:37 compute-0 python3.9[191412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:38 compute-0 python3.9[191533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212057.535598-240-129140618844046/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=87dede51a10e22722618c1900db75cb764463d91 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:39 compute-0 python3.9[191683]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:39 compute-0 python3.9[191804]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212058.74728-269-221470608095939/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:40 compute-0 python3.9[191954]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:41 compute-0 python3.9[192075]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212060.1624503-285-193483778041816/.source.yaml _original_basename=node_exporter.yaml follow=False checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:42 compute-0 python3.9[192225]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:42 compute-0 python3.9[192346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212061.5392315-300-67774744136444/.source.yaml _original_basename=podman_exporter.yaml follow=False checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:43 compute-0 python3.9[192496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:43 compute-0 python3.9[192617]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212062.8393776-315-257204265846776/.source.yaml _original_basename=ceilometer_prom_exporter.yaml follow=False checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:44 compute-0 sudo[192767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpqzhnqynstehiubvrvvpyvkuwollzbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212064.0643876-330-90500103522641/AnsiballZ_file.py'
Feb 27 17:07:44 compute-0 sudo[192767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:44 compute-0 python3.9[192770]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:44 compute-0 sudo[192767]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:44 compute-0 sudo[192920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijyptkaybhlaypjklpwbjdwrvslnnsgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212064.7281852-338-266004489140751/AnsiballZ_file.py'
Feb 27 17:07:44 compute-0 sudo[192920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:45 compute-0 python3.9[192923]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:45 compute-0 sudo[192920]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:45 compute-0 python3.9[193073]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:46 compute-0 python3.9[193225]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:07:47.080 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:07:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:07:47.080 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:07:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:07:47.080 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:07:47 compute-0 python3.9[193377]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:07:47 compute-0 sudo[193529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uegteteslnqcxlhysqtzngwxhqgasnky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212067.4733942-370-38629118279240/AnsiballZ_file.py'
Feb 27 17:07:47 compute-0 sudo[193529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:47 compute-0 python3.9[193532]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:47 compute-0 sudo[193529]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:48 compute-0 sudo[193682]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmcsvaxmhliqbebxdwmwonlscfvnwhaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212068.1133397-378-229601875600788/AnsiballZ_systemd_service.py'
Feb 27 17:07:48 compute-0 sudo[193682]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:48 compute-0 python3.9[193685]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:07:48 compute-0 systemd[1]: Reloading.
Feb 27 17:07:48 compute-0 systemd-rc-local-generator[193709]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:07:48 compute-0 systemd-sysv-generator[193714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:07:49 compute-0 systemd[1]: Listening on Podman API Socket.
Feb 27 17:07:49 compute-0 sudo[193682]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:49 compute-0 sudo[193881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvphkinyllafdcymevkzeltlkfuxctti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/AnsiballZ_stat.py'
Feb 27 17:07:49 compute-0 sudo[193881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:49 compute-0 python3.9[193884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:49 compute-0 sudo[193881]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:50 compute-0 sudo[194005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npuqhryyruvxbqgtodsmreggisalkqkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/AnsiballZ_copy.py'
Feb 27 17:07:50 compute-0 sudo[194005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:50 compute-0 python3.9[194008]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:50 compute-0 sudo[194005]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:50 compute-0 sudo[194082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxeeohpeluddljwqfzcmqqmcythcemuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/AnsiballZ_stat.py'
Feb 27 17:07:50 compute-0 sudo[194082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:50 compute-0 python3.9[194085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:50 compute-0 sudo[194082]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:51 compute-0 sudo[194206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhgmjeivdfljiyndpsztdmhsvrkerhgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/AnsiballZ_copy.py'
Feb 27 17:07:51 compute-0 sudo[194206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:51 compute-0 python3.9[194209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212069.3931975-387-254019575811688/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:51 compute-0 sudo[194206]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:52 compute-0 sudo[194359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqtnjelytpoynornxqzltccqdhfrpksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212072.102016-419-142141091583463/AnsiballZ_file.py'
Feb 27 17:07:52 compute-0 sudo[194359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:52 compute-0 python3.9[194362]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:52 compute-0 sudo[194359]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:53 compute-0 sudo[194524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgzunekjvrjowpjtblvvfwhyewrvwxga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212072.7708235-427-232501257373425/AnsiballZ_file.py'
Feb 27 17:07:53 compute-0 sudo[194524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:53 compute-0 podman[194486]: 2026-02-27 17:07:53.087116605 +0000 UTC m=+0.080814201 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:07:53 compute-0 python3.9[194531]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:07:53 compute-0 sudo[194524]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:53 compute-0 sudo[194683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtzyawogspjdulhskmjacxeknnqavqbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212073.4447887-435-249635932039951/AnsiballZ_stat.py'
Feb 27 17:07:53 compute-0 sudo[194683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:53 compute-0 python3.9[194686]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:07:53 compute-0 sudo[194683]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:54 compute-0 sudo[194807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgmqveejvhxxcxkcjchjafrebsnkhjdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212073.4447887-435-249635932039951/AnsiballZ_copy.py'
Feb 27 17:07:54 compute-0 sudo[194807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:54 compute-0 python3.9[194810]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212073.4447887-435-249635932039951/.source.json _original_basename=.4npml3vs follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:54 compute-0 sudo[194807]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:55 compute-0 python3.9[194960]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:07:56 compute-0 podman[195205]: 2026-02-27 17:07:56.078879587 +0000 UTC m=+0.070464056 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260223, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 27 17:07:57 compute-0 sudo[195407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iayaswgygydixfswaqodirpwwjqhohuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212076.5915549-475-63954873560206/AnsiballZ_container_config_data.py'
Feb 27 17:07:57 compute-0 sudo[195407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:57 compute-0 python3.9[195410]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 27 17:07:57 compute-0 sudo[195407]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:57 compute-0 sudo[195560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqukftgpkosobfyesdzmisnkikitinfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212077.6334105-486-213080181725322/AnsiballZ_container_config_hash.py'
Feb 27 17:07:57 compute-0 sudo[195560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:58 compute-0 python3.9[195563]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:07:58 compute-0 sudo[195560]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:58 compute-0 sudo[195713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzaykttihoicpilayabobxhkcermmuew ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212078.4610698-496-170752819297163/AnsiballZ_edpm_container_manage.py'
Feb 27 17:07:58 compute-0 sudo[195713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:07:59 compute-0 python3[195716]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:07:59 compute-0 podman[195746]: 2026-02-27 17:07:59.332831976 +0000 UTC m=+0.039727639 container create b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260223, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 27 17:07:59 compute-0 podman[195746]: 2026-02-27 17:07:59.309480441 +0000 UTC m=+0.016376124 image pull 3afa173a26fa8128cbac14b6c3e676d8aa6fde1ace8c482832813e31985446eb quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 27 17:07:59 compute-0 python3[195716]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Feb 27 17:07:59 compute-0 sudo[195713]: pam_unix(sudo:session): session closed for user root
Feb 27 17:07:59 compute-0 sudo[195932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eennaaxgrezpqxrjmfttwrrljflcypir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212079.6584585-504-20853180068546/AnsiballZ_stat.py'
Feb 27 17:07:59 compute-0 sudo[195932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:00 compute-0 python3.9[195935]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:00 compute-0 sudo[195932]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:00 compute-0 sudo[196087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjgvvoapivmupbdzuyrbgzppjslfyuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212080.351696-513-70638256689316/AnsiballZ_file.py'
Feb 27 17:08:00 compute-0 sudo[196087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:00 compute-0 python3.9[196090]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:00 compute-0 sudo[196087]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:01 compute-0 sudo[196164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etsxoygizdvktaidgaiujjymoordmwwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212080.351696-513-70638256689316/AnsiballZ_stat.py'
Feb 27 17:08:01 compute-0 sudo[196164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:01 compute-0 python3.9[196167]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:01 compute-0 sudo[196164]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:01 compute-0 sudo[196316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osstfmvrylienlwedlolhluvzwdjauek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212081.2442272-513-18925997719331/AnsiballZ_copy.py'
Feb 27 17:08:01 compute-0 sudo[196316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:01 compute-0 python3.9[196319]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772212081.2442272-513-18925997719331/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:01 compute-0 sudo[196316]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:02 compute-0 sudo[196393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phntzqmnavivzquqxurlcqjwxtsfwxlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212081.2442272-513-18925997719331/AnsiballZ_systemd.py'
Feb 27 17:08:02 compute-0 sudo[196393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.702 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.703 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.704 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.704 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.728 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.729 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.730 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.730 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.731 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.731 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.731 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.732 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.732 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.767 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.767 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.768 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.768 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:08:02 compute-0 python3.9[196396]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:08:02 compute-0 systemd[1]: Reloading.
Feb 27 17:08:02 compute-0 systemd-sysv-generator[196426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:02 compute-0 systemd-rc-local-generator[196423]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.937 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.937 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6106MB free_disk=73.42350006103516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.938 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:08:02 compute-0 nova_compute[186840]: 2026-02-27 17:08:02.938 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.023 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.024 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.045 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:08:03 compute-0 sudo[196393]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.061 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.062 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:08:03 compute-0 nova_compute[186840]: 2026-02-27 17:08:03.063 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:08:03 compute-0 sudo[196513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cshyuplerochewiluplgslrekdtjbipn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212081.2442272-513-18925997719331/AnsiballZ_systemd.py'
Feb 27 17:08:03 compute-0 sudo[196513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:03 compute-0 python3.9[196516]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:08:03 compute-0 systemd[1]: Reloading.
Feb 27 17:08:03 compute-0 systemd-rc-local-generator[196545]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:03 compute-0 systemd-sysv-generator[196550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:04 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Feb 27 17:08:04 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:08:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6107765c33b9e174ae9661b11e98d7df27e56a04c718df7bbeef6921e99523eb/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6107765c33b9e174ae9661b11e98d7df27e56a04c718df7bbeef6921e99523eb/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6107765c33b9e174ae9661b11e98d7df27e56a04c718df7bbeef6921e99523eb/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:04 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6107765c33b9e174ae9661b11e98d7df27e56a04c718df7bbeef6921e99523eb/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:04 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933.
Feb 27 17:08:04 compute-0 podman[196563]: 2026-02-27 17:08:04.195386387 +0000 UTC m=+0.110196525 container init b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + sudo -E kolla_set_configs
Feb 27 17:08:04 compute-0 sudo[196584]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: sudo: unable to send audit message: Operation not permitted
Feb 27 17:08:04 compute-0 sudo[196584]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 27 17:08:04 compute-0 sudo[196584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 27 17:08:04 compute-0 podman[196563]: 2026-02-27 17:08:04.231742672 +0000 UTC m=+0.146552720 container start b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0)
Feb 27 17:08:04 compute-0 podman[196563]: ceilometer_agent_compute
Feb 27 17:08:04 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Validating config file
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Copying service configuration files
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: INFO:__main__:Writing out command to execute
Feb 27 17:08:04 compute-0 sudo[196513]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:04 compute-0 sudo[196584]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: ++ cat /run_command
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + ARGS=
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + sudo kolla_copy_cacerts
Feb 27 17:08:04 compute-0 sudo[196605]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: sudo: unable to send audit message: Operation not permitted
Feb 27 17:08:04 compute-0 podman[196585]: 2026-02-27 17:08:04.292219351 +0000 UTC m=+0.052723179 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 27 17:08:04 compute-0 sudo[196605]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 27 17:08:04 compute-0 sudo[196605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 27 17:08:04 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Main process exited, code=exited, status=1/FAILURE
Feb 27 17:08:04 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Failed with result 'exit-code'.
Feb 27 17:08:04 compute-0 sudo[196605]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + [[ ! -n '' ]]
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + . kolla_extend_start
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + umask 0022
Feb 27 17:08:04 compute-0 ceilometer_agent_compute[196578]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.010 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.010 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.010 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.010 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.011 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.012 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.013 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.014 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.015 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.016 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.017 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.018 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.019 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.020 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.021 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.022 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.023 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.024 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.024 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.024 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.041 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.043 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.044 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 27 17:08:05 compute-0 python3.9[196759]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.155 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.231 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.232 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.233 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.234 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.235 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.236 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.237 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.238 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.239 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.240 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.241 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.242 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.243 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.244 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.245 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.246 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.247 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.248 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.249 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.253 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.262 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:08:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:08:06 compute-0 sudo[196915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brxibhqeldyzakskeilesijvfutqtdok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212085.7302828-558-130039314193688/AnsiballZ_stat.py'
Feb 27 17:08:06 compute-0 sudo[196915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:06 compute-0 python3.9[196918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:06 compute-0 sudo[196915]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:06 compute-0 sudo[197041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwmqczzmwncivbnkxjcmclfsxwolodgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212085.7302828-558-130039314193688/AnsiballZ_copy.py'
Feb 27 17:08:06 compute-0 sudo[197041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:06 compute-0 python3.9[197044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212085.7302828-558-130039314193688/.source.yaml _original_basename=.iil8ppeu follow=False checksum=f19ac831e06fce6186bd12425c6a568f461f1199 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:06 compute-0 sudo[197041]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:07 compute-0 sudo[197194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbzfyjuqbvxovyhafsnoluromwgxmxzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212087.0464802-573-274070405579866/AnsiballZ_stat.py'
Feb 27 17:08:07 compute-0 sudo[197194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:07 compute-0 python3.9[197197]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:07 compute-0 sudo[197194]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:07 compute-0 sudo[197318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyyxbpufsqnwuoizkzkhlvonxpzivmvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212087.0464802-573-274070405579866/AnsiballZ_copy.py'
Feb 27 17:08:07 compute-0 sudo[197318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:08 compute-0 python3.9[197321]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212087.0464802-573-274070405579866/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:08 compute-0 sudo[197318]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:09 compute-0 sudo[197471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihgfkisqoxatunctiaesvnqamznheozd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212088.7558317-594-62946033528744/AnsiballZ_file.py'
Feb 27 17:08:09 compute-0 sudo[197471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:09 compute-0 python3.9[197474]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:09 compute-0 sudo[197471]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:09 compute-0 sudo[197624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixuoryglfizcljmtjursgigryuunccnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212089.5615785-602-204413218200582/AnsiballZ_file.py'
Feb 27 17:08:09 compute-0 sudo[197624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:10 compute-0 python3.9[197627]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:10 compute-0 sudo[197624]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:10 compute-0 sudo[197777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kekyyzerqpfdczterolmfwckjbgscsin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212090.2908523-610-260697607961938/AnsiballZ_stat.py'
Feb 27 17:08:10 compute-0 sudo[197777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:10 compute-0 python3.9[197780]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:10 compute-0 sudo[197777]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:11 compute-0 sudo[197856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlbfwtdmudnunudnvajjfvfltxyelyjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212090.2908523-610-260697607961938/AnsiballZ_file.py'
Feb 27 17:08:11 compute-0 sudo[197856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:11 compute-0 python3.9[197859]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.f6oniza_ recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:11 compute-0 sudo[197856]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:11 compute-0 python3.9[198009]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:13 compute-0 sudo[198430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qafffixoutaelrapnyflogbihifiztmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212093.5569758-647-136163217213566/AnsiballZ_container_config_data.py'
Feb 27 17:08:13 compute-0 sudo[198430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:14 compute-0 python3.9[198433]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 27 17:08:14 compute-0 sudo[198430]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:14 compute-0 sudo[198583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdynkowqxycpsfrziwynknyyaceqnnmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212094.3837557-658-182487595594226/AnsiballZ_container_config_hash.py'
Feb 27 17:08:14 compute-0 sudo[198583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:14 compute-0 python3.9[198586]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:08:14 compute-0 sudo[198583]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:15 compute-0 sudo[198736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozsicykbbapvlomcxmfrjlrovsuzdago ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212095.1249287-668-173276185856820/AnsiballZ_edpm_container_manage.py'
Feb 27 17:08:15 compute-0 sudo[198736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:15 compute-0 python3[198739]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:08:15 compute-0 podman[198775]: 2026-02-27 17:08:15.902564687 +0000 UTC m=+0.049446352 container create 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:08:15 compute-0 podman[198775]: 2026-02-27 17:08:15.876216867 +0000 UTC m=+0.023098572 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Feb 27 17:08:15 compute-0 python3[198739]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /:/rootfs:ro --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl --path.rootfs=/rootfs
Feb 27 17:08:16 compute-0 sudo[198736]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:16 compute-0 sudo[198964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wasdbnqurrpynisxqjuzxpojxtvmiooh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212096.1898441-676-172118585541275/AnsiballZ_stat.py'
Feb 27 17:08:16 compute-0 sudo[198964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:16 compute-0 python3.9[198967]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:16 compute-0 sudo[198964]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:17 compute-0 sudo[199119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzqorwbfqmaeuibfinbjmialiwiysasc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212096.9509852-685-257425444792363/AnsiballZ_file.py'
Feb 27 17:08:17 compute-0 sudo[199119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:17 compute-0 python3.9[199122]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:17 compute-0 sudo[199119]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:17 compute-0 sudo[199196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwsmrbtryjihblouzdpwfhizduubtqly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212096.9509852-685-257425444792363/AnsiballZ_stat.py'
Feb 27 17:08:17 compute-0 sudo[199196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:17 compute-0 python3.9[199199]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:17 compute-0 sudo[199196]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:18 compute-0 sudo[199348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seuwskjsfqafjqhonnfjfnjavfkimokq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212098.0833795-685-8919102100044/AnsiballZ_copy.py'
Feb 27 17:08:18 compute-0 sudo[199348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:18 compute-0 python3.9[199351]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772212098.0833795-685-8919102100044/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:18 compute-0 sudo[199348]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:19 compute-0 sudo[199425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtqurdooeiunlxtkjpyykrxjwyenfzou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212098.0833795-685-8919102100044/AnsiballZ_systemd.py'
Feb 27 17:08:19 compute-0 sudo[199425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:19 compute-0 python3.9[199428]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:08:19 compute-0 systemd[1]: Reloading.
Feb 27 17:08:19 compute-0 systemd-rc-local-generator[199459]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:19 compute-0 systemd-sysv-generator[199463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:19 compute-0 sudo[199425]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:19 compute-0 sudo[199544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hisxdourmrctxynjmpgcouuddbltriqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212098.0833795-685-8919102100044/AnsiballZ_systemd.py'
Feb 27 17:08:19 compute-0 sudo[199544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:20 compute-0 python3.9[199547]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:08:20 compute-0 systemd[1]: Reloading.
Feb 27 17:08:20 compute-0 systemd-sysv-generator[199583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:20 compute-0 systemd-rc-local-generator[199579]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:20 compute-0 systemd[1]: Starting node_exporter container...
Feb 27 17:08:20 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:08:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/713f49d54a54491ee47602ef655e59a46e507df930b8a4c49413b0414984d6f7/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/713f49d54a54491ee47602ef655e59a46e507df930b8a4c49413b0414984d6f7/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:20 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423.
Feb 27 17:08:20 compute-0 podman[199594]: 2026-02-27 17:08:20.711542866 +0000 UTC m=+0.121976785 container init 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.727Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.727Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.727Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.728Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=arp
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=bcache
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=bonding
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=cpu
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=edac
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=filefd
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=netclass
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=netdev
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=netstat
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=nfs
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=nvme
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=softnet
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=systemd
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=xfs
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.729Z caller=node_exporter.go:117 level=info collector=zfs
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.730Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 27 17:08:20 compute-0 node_exporter[199609]: ts=2026-02-27T17:08:20.731Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Feb 27 17:08:20 compute-0 podman[199594]: 2026-02-27 17:08:20.741306927 +0000 UTC m=+0.151740836 container start 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:08:20 compute-0 podman[199594]: node_exporter
Feb 27 17:08:20 compute-0 systemd[1]: Started node_exporter container.
Feb 27 17:08:20 compute-0 sudo[199544]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:20 compute-0 podman[199618]: 2026-02-27 17:08:20.831346448 +0000 UTC m=+0.077404960 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:08:21 compute-0 python3.9[199792]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:08:21 compute-0 auditd[716]: Audit daemon rotating log files
Feb 27 17:08:22 compute-0 sudo[199942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfjcssfmfwvwynktlrkasixgwafltnmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212102.2437897-730-145335348775845/AnsiballZ_stat.py'
Feb 27 17:08:22 compute-0 sudo[199942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:22 compute-0 python3.9[199945]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:22 compute-0 sudo[199942]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:23 compute-0 sudo[200068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbfjtcmdnhptmverzdatljedpnezgdmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212102.2437897-730-145335348775845/AnsiballZ_copy.py'
Feb 27 17:08:23 compute-0 sudo[200068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:23 compute-0 podman[200070]: 2026-02-27 17:08:23.224031 +0000 UTC m=+0.089088309 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 27 17:08:23 compute-0 python3.9[200072]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212102.2437897-730-145335348775845/.source.yaml _original_basename=.qpp_cnll follow=False checksum=d02242f549f693d6cc28e6319b37239782b25afb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:23 compute-0 sudo[200068]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:23 compute-0 sudo[200240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhmmfqouitrrjrzxgdgooeblsqvcgyuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212103.5947473-745-169445398447963/AnsiballZ_stat.py'
Feb 27 17:08:23 compute-0 sudo[200240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:24 compute-0 python3.9[200243]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:24 compute-0 sudo[200240]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:24 compute-0 sudo[200364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlmompfgjjenqlnmmaupavwjafpblued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212103.5947473-745-169445398447963/AnsiballZ_copy.py'
Feb 27 17:08:24 compute-0 sudo[200364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:24 compute-0 python3.9[200367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212103.5947473-745-169445398447963/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:24 compute-0 sudo[200364]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:25 compute-0 sudo[200517]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oidnrbzdivbroxsswudvkqeyaqcgzgxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212105.3859046-766-163416578265947/AnsiballZ_file.py'
Feb 27 17:08:25 compute-0 sudo[200517]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:25 compute-0 python3.9[200520]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:25 compute-0 sudo[200517]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:26 compute-0 sudo[200679]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkwpklsmoofakypcshswyhkcllhwwcye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212106.1531193-774-116172944599032/AnsiballZ_file.py'
Feb 27 17:08:26 compute-0 sudo[200679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:26 compute-0 podman[200644]: 2026-02-27 17:08:26.600761388 +0000 UTC m=+0.155179988 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 27 17:08:26 compute-0 python3.9[200689]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:26 compute-0 sudo[200679]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:27 compute-0 sudo[200849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnwcnuuegdhawjnnnauasrttzddktxrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212106.923985-782-171366717125248/AnsiballZ_stat.py'
Feb 27 17:08:27 compute-0 sudo[200849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:27 compute-0 python3.9[200852]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:27 compute-0 sudo[200849]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:27 compute-0 sudo[200928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clfnkvolbhqazopdzssnsyamwhomzkql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212106.923985-782-171366717125248/AnsiballZ_file.py'
Feb 27 17:08:27 compute-0 sudo[200928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:27 compute-0 python3.9[200931]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.fki61wxp recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:27 compute-0 sudo[200928]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:28 compute-0 python3.9[201081]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:30 compute-0 sudo[201502]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdywnbxyqvhqnsswdozgbtzncminuivi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212110.1288705-819-272338178284488/AnsiballZ_container_config_data.py'
Feb 27 17:08:30 compute-0 sudo[201502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:30 compute-0 python3.9[201505]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 27 17:08:30 compute-0 sudo[201502]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:31 compute-0 sudo[201655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkwtwxqtdaxpmowltckgulfjrlzdzxkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212111.0413902-830-255951507728378/AnsiballZ_container_config_hash.py'
Feb 27 17:08:31 compute-0 sudo[201655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:31 compute-0 python3.9[201658]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:08:31 compute-0 sudo[201655]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:32 compute-0 sudo[201808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqcvsenzmprbqewjxlmljdhiwwkkxajm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212111.9192805-840-4237411380946/AnsiballZ_edpm_container_manage.py'
Feb 27 17:08:32 compute-0 sudo[201808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:32 compute-0 python3[201811]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:08:34 compute-0 podman[201826]: 2026-02-27 17:08:34.067496101 +0000 UTC m=+1.502328975 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 27 17:08:34 compute-0 podman[201922]: 2026-02-27 17:08:34.195824867 +0000 UTC m=+0.040710044 container create 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:08:34 compute-0 podman[201922]: 2026-02-27 17:08:34.173534124 +0000 UTC m=+0.018419341 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Feb 27 17:08:34 compute-0 python3[201811]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z 
quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Feb 27 17:08:34 compute-0 sudo[201808]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:34 compute-0 podman[201961]: 2026-02-27 17:08:34.471087192 +0000 UTC m=+0.073700862 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 27 17:08:34 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Main process exited, code=exited, status=1/FAILURE
Feb 27 17:08:34 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Failed with result 'exit-code'.
Feb 27 17:08:34 compute-0 sudo[202130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehboafaaufrihodrnrqxjjgrnjtbzvav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212114.5122943-848-260664696461467/AnsiballZ_stat.py'
Feb 27 17:08:34 compute-0 sudo[202130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:34 compute-0 python3.9[202133]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:34 compute-0 sudo[202130]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:35 compute-0 sudo[202285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiinywslsxhciwgmwqnputevarhcukpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212115.2782323-857-246789412241947/AnsiballZ_file.py'
Feb 27 17:08:35 compute-0 sudo[202285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:35 compute-0 python3.9[202288]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:35 compute-0 sudo[202285]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:36 compute-0 sudo[202362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghbiwolmcylqsxwigbkbauvmvkomzzxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212115.2782323-857-246789412241947/AnsiballZ_stat.py'
Feb 27 17:08:36 compute-0 sudo[202362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:36 compute-0 python3.9[202365]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:36 compute-0 sudo[202362]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:36 compute-0 sudo[202514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrtfyxhvjcvpjujlbaqbvssxxtklpcsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212116.3299549-857-207344100565085/AnsiballZ_copy.py'
Feb 27 17:08:36 compute-0 sudo[202514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:37 compute-0 python3.9[202517]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772212116.3299549-857-207344100565085/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:37 compute-0 sudo[202514]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:37 compute-0 sudo[202591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhjnhpiuvbkpgkcermpvalcbhqzfleoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212116.3299549-857-207344100565085/AnsiballZ_systemd.py'
Feb 27 17:08:37 compute-0 sudo[202591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:37 compute-0 python3.9[202594]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:08:37 compute-0 systemd[1]: Reloading.
Feb 27 17:08:37 compute-0 systemd-sysv-generator[202624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:37 compute-0 systemd-rc-local-generator[202617]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:37 compute-0 sudo[202591]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:38 compute-0 sudo[202710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buefyuaidfthwmpihyuwiuijwsvnqfhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212116.3299549-857-207344100565085/AnsiballZ_systemd.py'
Feb 27 17:08:38 compute-0 sudo[202710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:38 compute-0 python3.9[202713]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:08:38 compute-0 systemd[1]: Reloading.
Feb 27 17:08:38 compute-0 systemd-sysv-generator[202741]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:38 compute-0 systemd-rc-local-generator[202737]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:38 compute-0 systemd[1]: Starting podman_exporter container...
Feb 27 17:08:39 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/790357e4a7f14ca1263b161ff71fc38887a5d993fc89fbade6a79d252ba544bd/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/790357e4a7f14ca1263b161ff71fc38887a5d993fc89fbade6a79d252ba544bd/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:39 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7.
Feb 27 17:08:39 compute-0 podman[202759]: 2026-02-27 17:08:39.230909037 +0000 UTC m=+0.228942280 container init 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.255Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.255Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.255Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.255Z caller=handler.go:105 level=info collector=container
Feb 27 17:08:39 compute-0 podman[202759]: 2026-02-27 17:08:39.264701334 +0000 UTC m=+0.262734487 container start 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:08:39 compute-0 podman[202759]: podman_exporter
Feb 27 17:08:39 compute-0 systemd[1]: Starting Podman API Service...
Feb 27 17:08:39 compute-0 systemd[1]: Started podman_exporter container.
Feb 27 17:08:39 compute-0 systemd[1]: Started Podman API Service.
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="Setting parallel job count to 25"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="Using sqlite as database backend"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Feb 27 17:08:39 compute-0 sudo[202710]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:39 compute-0 podman[202785]: @ - - [27/Feb/2026:17:08:39 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 27 17:08:39 compute-0 podman[202785]: time="2026-02-27T17:08:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 27 17:08:39 compute-0 podman[202785]: @ - - [27/Feb/2026:17:08:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 18614 "" "Go-http-client/1.1"
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.362Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.363Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 27 17:08:39 compute-0 podman_exporter[202774]: ts=2026-02-27T17:08:39.364Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Feb 27 17:08:39 compute-0 podman[202783]: 2026-02-27 17:08:39.374019825 +0000 UTC m=+0.096882755 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:08:39 compute-0 systemd[1]: 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7-638bef23edb1e64e.service: Main process exited, code=exited, status=1/FAILURE
Feb 27 17:08:39 compute-0 systemd[1]: 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7-638bef23edb1e64e.service: Failed with result 'exit-code'.
Feb 27 17:08:40 compute-0 python3.9[202968]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:08:41 compute-0 sudo[203118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlfhekqkkyrnromjswgavicyrrgqgizw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212120.7900302-902-3960669309754/AnsiballZ_stat.py'
Feb 27 17:08:41 compute-0 sudo[203118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:41 compute-0 python3.9[203121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:41 compute-0 sudo[203118]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:41 compute-0 sudo[203244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tbvlapttpnxypdllpepwuztzpasygvkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212120.7900302-902-3960669309754/AnsiballZ_copy.py'
Feb 27 17:08:41 compute-0 sudo[203244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:42 compute-0 python3.9[203247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212120.7900302-902-3960669309754/.source.yaml _original_basename=.3g9krps4 follow=False checksum=355487622cc80107dcf81154047d9f58f6f708dd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:42 compute-0 sudo[203244]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:42 compute-0 sudo[203397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjtlfmssbipypichpgjxygonffgqifow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212122.4067338-917-58173035819240/AnsiballZ_stat.py'
Feb 27 17:08:42 compute-0 sudo[203397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:42 compute-0 python3.9[203400]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:42 compute-0 sudo[203397]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:43 compute-0 sudo[203521]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olycwqnadhhcfpvkhamyjnommfslgamd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212122.4067338-917-58173035819240/AnsiballZ_copy.py'
Feb 27 17:08:43 compute-0 sudo[203521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:43 compute-0 python3.9[203524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1772212122.4067338-917-58173035819240/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:43 compute-0 sudo[203521]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:44 compute-0 sudo[203674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaemlflbeumkbpxgbddcfkodftbvutqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212124.2345955-938-119814422459544/AnsiballZ_file.py'
Feb 27 17:08:44 compute-0 sudo[203674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:44 compute-0 python3.9[203677]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:44 compute-0 sudo[203674]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:45 compute-0 sudo[203827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyyvjnaxgedrtyivtngypkddxwsgdwpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212125.0008397-946-280032929303151/AnsiballZ_file.py'
Feb 27 17:08:45 compute-0 sudo[203827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:45 compute-0 python3.9[203830]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 27 17:08:45 compute-0 sudo[203827]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:46 compute-0 sudo[203980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slcmaxmuntfyrtbehqixbbujthttbejj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212125.743225-954-234716193739256/AnsiballZ_stat.py'
Feb 27 17:08:46 compute-0 sudo[203980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:46 compute-0 python3.9[203983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:08:46 compute-0 sudo[203980]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:46 compute-0 sudo[204059]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivvxdgajbbxnnewotqepgvpqznshjkwh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212125.743225-954-234716193739256/AnsiballZ_file.py'
Feb 27 17:08:46 compute-0 sudo[204059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:46 compute-0 python3.9[204062]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.ulcfj2qb recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:46 compute-0 sudo[204059]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:08:47.081 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:08:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:08:47.082 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:08:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:08:47.082 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:08:47 compute-0 python3.9[204212]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:49 compute-0 sudo[204633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdaapxkwnffwgmxtxuxxfmvcqmoavjrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212129.4685745-991-160332562180735/AnsiballZ_container_config_data.py'
Feb 27 17:08:49 compute-0 sudo[204633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:49 compute-0 python3.9[204636]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 27 17:08:49 compute-0 sudo[204633]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:50 compute-0 sudo[204786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhxdjxcwujghgbmchmlefhbfqfifuzmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212130.3798726-1002-137355269175412/AnsiballZ_container_config_hash.py'
Feb 27 17:08:50 compute-0 sudo[204786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:50 compute-0 python3.9[204789]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 27 17:08:50 compute-0 sudo[204786]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:51 compute-0 podman[204790]: 2026-02-27 17:08:51.014010603 +0000 UTC m=+0.070006214 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:08:51 compute-0 sudo[204963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmfyzmesyfdfckcomebijddwdaywxhza ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212131.2554615-1012-72140770109008/AnsiballZ_edpm_container_manage.py'
Feb 27 17:08:51 compute-0 sudo[204963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:51 compute-0 python3[204966]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 27 17:08:53 compute-0 podman[205022]: 2026-02-27 17:08:53.516395955 +0000 UTC m=+0.092036399 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 27 17:08:54 compute-0 podman[204979]: 2026-02-27 17:08:54.574370036 +0000 UTC m=+2.529386840 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 27 17:08:54 compute-0 podman[205094]: 2026-02-27 17:08:54.68467911 +0000 UTC m=+0.044855411 container create 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal)
Feb 27 17:08:54 compute-0 podman[205094]: 2026-02-27 17:08:54.657971353 +0000 UTC m=+0.018147644 image pull ba17a9079b86bdce2cd9e03cad5d6d4d255cc298efd741b09239c34192e5621b quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 27 17:08:54 compute-0 python3[204966]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Feb 27 17:08:54 compute-0 sudo[204963]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:55 compute-0 sudo[205282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skktridzpqjoovejbovivobirgzzvrii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212135.0445702-1020-193094142814217/AnsiballZ_stat.py'
Feb 27 17:08:55 compute-0 sudo[205282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:55 compute-0 python3.9[205285]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:55 compute-0 sudo[205282]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:56 compute-0 sudo[205437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fplpafpwueghappvaubxoonxaatkipub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212135.8590903-1029-96541227834745/AnsiballZ_file.py'
Feb 27 17:08:56 compute-0 sudo[205437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:56 compute-0 python3.9[205440]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:56 compute-0 sudo[205437]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:56 compute-0 sudo[205514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbueowpupmpecltiogntyxmytjaxdrnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212135.8590903-1029-96541227834745/AnsiballZ_stat.py'
Feb 27 17:08:56 compute-0 sudo[205514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:56 compute-0 podman[205516]: 2026-02-27 17:08:56.770484133 +0000 UTC m=+0.102510869 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2)
Feb 27 17:08:56 compute-0 python3.9[205518]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:08:56 compute-0 sudo[205514]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:57 compute-0 sudo[205694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omwlspkubnfhhmcbgvrcevpzemwqhfyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212136.969978-1029-237073368421289/AnsiballZ_copy.py'
Feb 27 17:08:57 compute-0 sudo[205694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:57 compute-0 python3.9[205697]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1772212136.969978-1029-237073368421289/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:08:57 compute-0 sudo[205694]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:57 compute-0 sudo[205771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkrcvahpmanxqlfggtbswvqpkjosprku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212136.969978-1029-237073368421289/AnsiballZ_systemd.py'
Feb 27 17:08:57 compute-0 sudo[205771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:58 compute-0 python3.9[205774]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 27 17:08:58 compute-0 systemd[1]: Reloading.
Feb 27 17:08:58 compute-0 systemd-rc-local-generator[205796]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:58 compute-0 systemd-sysv-generator[205799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:58 compute-0 sudo[205771]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:58 compute-0 sudo[205890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuqefkdsdtknhhapxakyiyjlwkklrgvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212136.969978-1029-237073368421289/AnsiballZ_systemd.py'
Feb 27 17:08:58 compute-0 sudo[205890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:08:59 compute-0 python3.9[205893]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 27 17:08:59 compute-0 systemd[1]: Reloading.
Feb 27 17:08:59 compute-0 systemd-rc-local-generator[205926]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 27 17:08:59 compute-0 systemd-sysv-generator[205931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 27 17:08:59 compute-0 systemd[1]: Starting openstack_network_exporter container...
Feb 27 17:08:59 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:08:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a35071a506e35246f5f934a319cd7babaebed7eb8c845c2e2b74d12b30972a/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a35071a506e35246f5f934a319cd7babaebed7eb8c845c2e2b74d12b30972a/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:59 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52a35071a506e35246f5f934a319cd7babaebed7eb8c845c2e2b74d12b30972a/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 27 17:08:59 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4.
Feb 27 17:08:59 compute-0 podman[205941]: 2026-02-27 17:08:59.722070826 +0000 UTC m=+0.174942730 container init 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7)
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *bridge.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *coverage.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *datapath.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *iface.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *memory.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *ovn.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *pmd_perf.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *pmd_rxq.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: INFO    17:08:59 main.go:48: registering *vswitch.Collector
Feb 27 17:08:59 compute-0 openstack_network_exporter[205956]: NOTICE  17:08:59 main.go:76: listening on https://:9105/metrics
Feb 27 17:08:59 compute-0 podman[205941]: 2026-02-27 17:08:59.767237295 +0000 UTC m=+0.220109149 container start 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Feb 27 17:08:59 compute-0 podman[205941]: openstack_network_exporter
Feb 27 17:08:59 compute-0 systemd[1]: Started openstack_network_exporter container.
Feb 27 17:08:59 compute-0 sudo[205890]: pam_unix(sudo:session): session closed for user root
Feb 27 17:08:59 compute-0 podman[205966]: 2026-02-27 17:08:59.884363803 +0000 UTC m=+0.101249540 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible)
Feb 27 17:09:00 compute-0 python3.9[206138]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 27 17:09:01 compute-0 sudo[206288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfbmdhljuvlcgpcavuyoizmcrxvzepnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212141.156387-1074-194209347877371/AnsiballZ_stat.py'
Feb 27 17:09:01 compute-0 sudo[206288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:01 compute-0 python3.9[206291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:01 compute-0 sudo[206288]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:02 compute-0 sudo[206414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-youpykiojnowsvnudmcmzajvlkytssry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212141.156387-1074-194209347877371/AnsiballZ_copy.py'
Feb 27 17:09:02 compute-0 sudo[206414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:02 compute-0 python3.9[206417]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212141.156387-1074-194209347877371/.source.yaml _original_basename=.fi8khvmu follow=False checksum=cdea90c52e88a40241f3dd32bda34503865c0089 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:02 compute-0 sudo[206414]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:02 compute-0 sudo[206567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkecizbmxqcdjkvtyunsnzwqwaxebohr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212142.6112049-1089-192095940006664/AnsiballZ_find.py'
Feb 27 17:09:02 compute-0 sudo[206567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.055 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.085 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:03 compute-0 python3.9[206570]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 27 17:09:03 compute-0 sudo[206567]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.712 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.715 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.716 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.746 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.746 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:03 compute-0 nova_compute[186840]: 2026-02-27 17:09:03.747 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 sudo[206720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uagyedbcdstjlllxiqxvijpijxyoquuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212143.5834906-1099-27654118772691/AnsiballZ_podman_container_info.py'
Feb 27 17:09:04 compute-0 sudo[206720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:04 compute-0 python3.9[206723]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 27 17:09:04 compute-0 sudo[206720]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:04 compute-0 podman[206737]: 2026-02-27 17:09:04.589698936 +0000 UTC m=+0.063750804 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:09:04 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Main process exited, code=exited, status=1/FAILURE
Feb 27 17:09:04 compute-0 systemd[1]: b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933-53385ffc935f8394.service: Failed with result 'exit-code'.
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.740 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.740 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.742 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.743 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.925 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.926 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5772MB free_disk=73.18021392822266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.926 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.926 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.994 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:09:04 compute-0 nova_compute[186840]: 2026-02-27 17:09:04.995 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:09:05 compute-0 nova_compute[186840]: 2026-02-27 17:09:05.025 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:09:05 compute-0 nova_compute[186840]: 2026-02-27 17:09:05.042 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:09:05 compute-0 nova_compute[186840]: 2026-02-27 17:09:05.045 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:09:05 compute-0 nova_compute[186840]: 2026-02-27 17:09:05.045 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:09:05 compute-0 sudo[206906]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrifqjuebnlzevhekjdaqyignlnikcyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212144.706553-1107-252672234653493/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:05 compute-0 sudo[206906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:05 compute-0 python3.9[206909]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:05 compute-0 systemd[1]: Started libpod-conmon-958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580.scope.
Feb 27 17:09:05 compute-0 podman[206910]: 2026-02-27 17:09:05.635025644 +0000 UTC m=+0.181560898 container exec 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 27 17:09:05 compute-0 podman[206930]: 2026-02-27 17:09:05.74452779 +0000 UTC m=+0.094894538 container exec_died 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 27 17:09:05 compute-0 podman[206910]: 2026-02-27 17:09:05.752199243 +0000 UTC m=+0.298734537 container exec_died 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 17:09:05 compute-0 systemd[1]: libpod-conmon-958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580.scope: Deactivated successfully.
Feb 27 17:09:05 compute-0 sudo[206906]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:06 compute-0 sudo[207092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbekvtbuygtzbfarqhsapdrbuxrmdbid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212145.9791818-1115-27976137353532/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:06 compute-0 sudo[207092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:06 compute-0 python3.9[207095]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:06 compute-0 systemd[1]: Started libpod-conmon-958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580.scope.
Feb 27 17:09:06 compute-0 podman[207096]: 2026-02-27 17:09:06.598456837 +0000 UTC m=+0.079238944 container exec 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, tcib_managed=true)
Feb 27 17:09:06 compute-0 podman[207096]: 2026-02-27 17:09:06.633607797 +0000 UTC m=+0.114389874 container exec_died 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:09:06 compute-0 systemd[1]: libpod-conmon-958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580.scope: Deactivated successfully.
Feb 27 17:09:06 compute-0 sudo[207092]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:07 compute-0 sudo[207277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwzjohdzpghmdrhrilekxawffqhskyio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212146.877166-1123-102020141517485/AnsiballZ_file.py'
Feb 27 17:09:07 compute-0 sudo[207277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:07 compute-0 python3.9[207280]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:07 compute-0 sudo[207277]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:07 compute-0 sudo[207430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypztnyhfjbwahkxryqnifnwciixaxwvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212147.6534176-1132-262267748020607/AnsiballZ_podman_container_info.py'
Feb 27 17:09:07 compute-0 sudo[207430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:08 compute-0 python3.9[207433]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 27 17:09:08 compute-0 sudo[207430]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:08 compute-0 sudo[207596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elyejbxpdjbnyxrnelxfuazilkhpbshc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212148.4530575-1140-83517743707931/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:08 compute-0 sudo[207596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:09 compute-0 python3.9[207599]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:09 compute-0 systemd[1]: Started libpod-conmon-adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366.scope.
Feb 27 17:09:09 compute-0 podman[207600]: 2026-02-27 17:09:09.109345804 +0000 UTC m=+0.085559615 container exec adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 27 17:09:09 compute-0 podman[207600]: 2026-02-27 17:09:09.14686869 +0000 UTC m=+0.123082461 container exec_died adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent)
Feb 27 17:09:09 compute-0 systemd[1]: libpod-conmon-adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366.scope: Deactivated successfully.
Feb 27 17:09:09 compute-0 sudo[207596]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:09 compute-0 podman[207732]: 2026-02-27 17:09:09.695336022 +0000 UTC m=+0.089384586 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:09:09 compute-0 sudo[207805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkosxmhyysmyjxmgkqryoywmhunnnfmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212149.3957512-1148-118465143337259/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:09 compute-0 sudo[207805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:09 compute-0 python3.9[207808]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:10 compute-0 systemd[1]: Started libpod-conmon-adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366.scope.
Feb 27 17:09:10 compute-0 podman[207809]: 2026-02-27 17:09:10.018911051 +0000 UTC m=+0.087802679 container exec adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:09:10 compute-0 podman[207809]: 2026-02-27 17:09:10.055079175 +0000 UTC m=+0.123970793 container exec_died adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 27 17:09:10 compute-0 systemd[1]: libpod-conmon-adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366.scope: Deactivated successfully.
Feb 27 17:09:10 compute-0 sudo[207805]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:10 compute-0 sudo[207991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aevzfeosytsonhybuvscyvfmvphxrngy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212150.311522-1156-221556166688004/AnsiballZ_file.py'
Feb 27 17:09:10 compute-0 sudo[207991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:10 compute-0 python3.9[207994]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:10 compute-0 sudo[207991]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:11 compute-0 sudo[208144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eromuqykmsbygxoleghwwdfjmxqkcmbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212151.012376-1165-256014325426719/AnsiballZ_podman_container_info.py'
Feb 27 17:09:11 compute-0 sudo[208144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:11 compute-0 python3.9[208147]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 27 17:09:11 compute-0 sudo[208144]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:12 compute-0 sudo[208311]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqcercdhtbpsiuajxaibmejiloknzcsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212151.8409798-1173-251098537010859/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:12 compute-0 sudo[208311]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:12 compute-0 python3.9[208314]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:12 compute-0 systemd[1]: Started libpod-conmon-b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933.scope.
Feb 27 17:09:12 compute-0 podman[208315]: 2026-02-27 17:09:12.475349767 +0000 UTC m=+0.093666319 container exec b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 27 17:09:12 compute-0 podman[208315]: 2026-02-27 17:09:12.510452785 +0000 UTC m=+0.128769397 container exec_died b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 27 17:09:12 compute-0 systemd[1]: libpod-conmon-b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933.scope: Deactivated successfully.
Feb 27 17:09:12 compute-0 sudo[208311]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:13 compute-0 sudo[208495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pumhjafkyubiqsrxvytyldcesdnsluiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212152.769941-1181-172708462343006/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:13 compute-0 sudo[208495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:13 compute-0 python3.9[208498]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:13 compute-0 systemd[1]: Started libpod-conmon-b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933.scope.
Feb 27 17:09:13 compute-0 podman[208499]: 2026-02-27 17:09:13.369301609 +0000 UTC m=+0.070345111 container exec b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 27 17:09:13 compute-0 podman[208499]: 2026-02-27 17:09:13.399760067 +0000 UTC m=+0.100803559 container exec_died b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS)
Feb 27 17:09:13 compute-0 systemd[1]: libpod-conmon-b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933.scope: Deactivated successfully.
Feb 27 17:09:13 compute-0 sudo[208495]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:13 compute-0 sudo[208681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wibvkdujqkbglgaoqrmlqlvszjwkqjdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212153.6798387-1189-11911692897366/AnsiballZ_file.py'
Feb 27 17:09:13 compute-0 sudo[208681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:14 compute-0 python3.9[208684]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:14 compute-0 sudo[208681]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:14 compute-0 sudo[208834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myzaanelohxveknuvblduewtpqdeyhrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212154.4843943-1198-247151754555523/AnsiballZ_podman_container_info.py'
Feb 27 17:09:14 compute-0 sudo[208834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:14 compute-0 python3.9[208837]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 27 17:09:15 compute-0 sudo[208834]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:15 compute-0 sudo[209000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duiisbcyktmpknlcxrfzcprwafntgemy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212155.2609963-1206-57284011262047/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:15 compute-0 sudo[209000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:15 compute-0 python3.9[209003]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:15 compute-0 systemd[1]: Started libpod-conmon-98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423.scope.
Feb 27 17:09:15 compute-0 podman[209004]: 2026-02-27 17:09:15.828303557 +0000 UTC m=+0.088658329 container exec 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:09:15 compute-0 podman[209004]: 2026-02-27 17:09:15.863847536 +0000 UTC m=+0.124202268 container exec_died 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:09:15 compute-0 sudo[209000]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:15 compute-0 systemd[1]: libpod-conmon-98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423.scope: Deactivated successfully.
Feb 27 17:09:16 compute-0 sudo[209184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjpatzdpvnkvacfrnevnqteuakgarwox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212156.1036546-1214-413758190506/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:16 compute-0 sudo[209184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:16 compute-0 python3.9[209187]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:16 compute-0 systemd[1]: Started libpod-conmon-98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423.scope.
Feb 27 17:09:16 compute-0 podman[209188]: 2026-02-27 17:09:16.708229204 +0000 UTC m=+0.082620334 container exec 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:09:16 compute-0 podman[209188]: 2026-02-27 17:09:16.738305043 +0000 UTC m=+0.112696213 container exec_died 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:09:16 compute-0 systemd[1]: libpod-conmon-98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423.scope: Deactivated successfully.
Feb 27 17:09:16 compute-0 sudo[209184]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:17 compute-0 sudo[209373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djxjdpqalxspnkdenjcnicrmwzpsxcfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212157.000774-1222-129104813350556/AnsiballZ_file.py'
Feb 27 17:09:17 compute-0 sudo[209373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:17 compute-0 python3.9[209376]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:17 compute-0 sudo[209373]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:18 compute-0 sudo[209526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqvnomxatidlurzdvemevbaogaygjelg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212157.7628644-1231-162265593270871/AnsiballZ_podman_container_info.py'
Feb 27 17:09:18 compute-0 sudo[209526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:18 compute-0 python3.9[209529]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 27 17:09:18 compute-0 sudo[209526]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:18 compute-0 sudo[209693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmferaliohfgeocmkakugzispqfehnnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212158.7056463-1239-178350566904278/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:18 compute-0 sudo[209693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:19 compute-0 python3.9[209696]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:19 compute-0 systemd[1]: Started libpod-conmon-284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7.scope.
Feb 27 17:09:19 compute-0 podman[209697]: 2026-02-27 17:09:19.288885217 +0000 UTC m=+0.085282578 container exec 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:09:19 compute-0 podman[209697]: 2026-02-27 17:09:19.318559646 +0000 UTC m=+0.114956917 container exec_died 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:09:19 compute-0 systemd[1]: libpod-conmon-284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7.scope: Deactivated successfully.
Feb 27 17:09:19 compute-0 sudo[209693]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:19 compute-0 sudo[209879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhibftgdjiopgiqnmtfvxajzvwoutblh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212159.5540137-1247-217374237471029/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:19 compute-0 sudo[209879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:20 compute-0 python3.9[209882]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:20 compute-0 systemd[1]: Started libpod-conmon-284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7.scope.
Feb 27 17:09:20 compute-0 podman[209883]: 2026-02-27 17:09:20.179378024 +0000 UTC m=+0.093646475 container exec 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:09:20 compute-0 podman[209883]: 2026-02-27 17:09:20.212645845 +0000 UTC m=+0.126914246 container exec_died 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:09:20 compute-0 systemd[1]: libpod-conmon-284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7.scope: Deactivated successfully.
Feb 27 17:09:20 compute-0 sudo[209879]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:20 compute-0 sudo[210065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ydhtbyhadpnfthopvqctpigrmlmbrgcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212160.4551704-1255-4457792840597/AnsiballZ_file.py'
Feb 27 17:09:20 compute-0 sudo[210065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:20 compute-0 python3.9[210068]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:20 compute-0 sudo[210065]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:21 compute-0 sudo[210228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yiqprxuaprqmapfnyicejjihiujmfsmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212161.1415262-1264-66446075398928/AnsiballZ_podman_container_info.py'
Feb 27 17:09:21 compute-0 sudo[210228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:21 compute-0 podman[210192]: 2026-02-27 17:09:21.472933972 +0000 UTC m=+0.063994630 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:09:21 compute-0 python3.9[210238]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 27 17:09:21 compute-0 sudo[210228]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:22 compute-0 sudo[210408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvqwrrpnhognepelsktejneljimttmzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212161.96222-1272-157655118694069/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:22 compute-0 sudo[210408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:22 compute-0 python3.9[210411]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:22 compute-0 systemd[1]: Started libpod-conmon-55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4.scope.
Feb 27 17:09:22 compute-0 podman[210412]: 2026-02-27 17:09:22.606471922 +0000 UTC m=+0.089858086 container exec 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.buildah.version=1.33.7)
Feb 27 17:09:22 compute-0 podman[210412]: 2026-02-27 17:09:22.641705412 +0000 UTC m=+0.125091496 container exec_died 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, vendor=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:09:22 compute-0 systemd[1]: libpod-conmon-55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4.scope: Deactivated successfully.
Feb 27 17:09:22 compute-0 sudo[210408]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:23 compute-0 sudo[210594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxhpalmykngngbuicztffmppyatrsbry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212162.876109-1280-107660536811187/AnsiballZ_podman_container_exec.py'
Feb 27 17:09:23 compute-0 sudo[210594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:23 compute-0 python3.9[210597]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 27 17:09:23 compute-0 systemd[1]: Started libpod-conmon-55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4.scope.
Feb 27 17:09:23 compute-0 podman[210598]: 2026-02-27 17:09:23.484537668 +0000 UTC m=+0.076665826 container exec 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, name=ubi9/ubi-minimal, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.expose-services=)
Feb 27 17:09:23 compute-0 podman[210598]: 2026-02-27 17:09:23.51861787 +0000 UTC m=+0.110745988 container exec_died 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal)
Feb 27 17:09:23 compute-0 sudo[210594]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:23 compute-0 systemd[1]: libpod-conmon-55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4.scope: Deactivated successfully.
Feb 27 17:09:23 compute-0 podman[210628]: 2026-02-27 17:09:23.657209592 +0000 UTC m=+0.072926773 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 27 17:09:24 compute-0 sudo[210797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmqrjrqojbraehlnwnxbglsggwdgtiuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212163.7227404-1288-221972340224913/AnsiballZ_file.py'
Feb 27 17:09:24 compute-0 sudo[210797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:24 compute-0 python3.9[210800]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:24 compute-0 sudo[210797]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:24 compute-0 sudo[210950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etkjzevoqxcgpyjuanzqynwxoxfudhwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212164.4930599-1297-53281934213653/AnsiballZ_file.py'
Feb 27 17:09:24 compute-0 sudo[210950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:24 compute-0 python3.9[210953]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:25 compute-0 sudo[210950]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:25 compute-0 sudo[211103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pybiqcxoqiuvqmzyevnayhajokuxpluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212165.2101648-1305-274126475075047/AnsiballZ_stat.py'
Feb 27 17:09:25 compute-0 sudo[211103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:25 compute-0 python3.9[211106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:25 compute-0 sudo[211103]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:26 compute-0 sudo[211227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-alueauwqtbezcfuhegztutexhykuymgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212165.2101648-1305-274126475075047/AnsiballZ_copy.py'
Feb 27 17:09:26 compute-0 sudo[211227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:26 compute-0 python3.9[211230]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1772212165.2101648-1305-274126475075047/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:26 compute-0 sudo[211227]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:26 compute-0 sudo[211391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svicxctjrofjdsujmejnfttordvukmli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212166.6000054-1321-62505404822394/AnsiballZ_file.py'
Feb 27 17:09:26 compute-0 sudo[211391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:26 compute-0 podman[211354]: 2026-02-27 17:09:26.935692931 +0000 UTC m=+0.104149712 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 27 17:09:27 compute-0 python3.9[211398]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:27 compute-0 sudo[211391]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:27 compute-0 sudo[211559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egsuliymfqbweseaivtlewaaqaouqexo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212167.3300078-1329-76462020671213/AnsiballZ_stat.py'
Feb 27 17:09:27 compute-0 sudo[211559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:27 compute-0 python3.9[211562]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:27 compute-0 sudo[211559]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:28 compute-0 sudo[211638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipjdwgcxhzopklwzfsfqttgcvldzoriu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212167.3300078-1329-76462020671213/AnsiballZ_file.py'
Feb 27 17:09:28 compute-0 sudo[211638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:28 compute-0 python3.9[211641]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:28 compute-0 sudo[211638]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:28 compute-0 sudo[211791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stbnwcnmachwnqsvbuhtmftrcofciukc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212168.557849-1341-21037552793556/AnsiballZ_stat.py'
Feb 27 17:09:28 compute-0 sudo[211791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:29 compute-0 python3.9[211794]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:29 compute-0 sudo[211791]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:29 compute-0 sudo[211870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goekpoplxiuwrthvadhhaikzencgifpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212168.557849-1341-21037552793556/AnsiballZ_file.py'
Feb 27 17:09:29 compute-0 sudo[211870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:29 compute-0 python3.9[211873]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._btm_m2c recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:29 compute-0 sudo[211870]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:30 compute-0 sudo[212036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxqvcldnouktosyzyajbcmfopchbueoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212169.7628016-1353-63673933967185/AnsiballZ_stat.py'
Feb 27 17:09:30 compute-0 sudo[212036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:30 compute-0 podman[211997]: 2026-02-27 17:09:30.088462119 +0000 UTC m=+0.064755019 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, version=9.7, config_id=openstack_network_exporter)
Feb 27 17:09:30 compute-0 python3.9[212047]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:30 compute-0 sudo[212036]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:30 compute-0 sudo[212123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcslnnplzadkmqxxtrcdpjbpgndthesx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212169.7628016-1353-63673933967185/AnsiballZ_file.py'
Feb 27 17:09:30 compute-0 sudo[212123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:30 compute-0 python3.9[212126]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:30 compute-0 sudo[212123]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:31 compute-0 sudo[212276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-faknkfugbirogiyhknoikpmdrqjuokuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212170.9342897-1366-3641880807777/AnsiballZ_command.py'
Feb 27 17:09:31 compute-0 sudo[212276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:31 compute-0 python3.9[212279]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:09:31 compute-0 sudo[212276]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:32 compute-0 sudo[212430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvomtlhjietsnfkgpejcaqzzwyelizdf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1772212171.721337-1374-114345190268933/AnsiballZ_edpm_nftables_from_files.py'
Feb 27 17:09:32 compute-0 sudo[212430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:32 compute-0 python3[212433]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 27 17:09:32 compute-0 sudo[212430]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:32 compute-0 sudo[212583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldkayvcdhssjiwvhomtrrzpulbypcnme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212172.6257863-1382-213128084202405/AnsiballZ_stat.py'
Feb 27 17:09:32 compute-0 sudo[212583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:33 compute-0 python3.9[212586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:33 compute-0 sudo[212583]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:33 compute-0 sudo[212662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlivquiwxjmssmubuedalnoeqgullnei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212172.6257863-1382-213128084202405/AnsiballZ_file.py'
Feb 27 17:09:33 compute-0 sudo[212662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:33 compute-0 python3.9[212665]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:33 compute-0 sudo[212662]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:34 compute-0 sudo[212815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfmbimbdfodmvtbrzlovfyflhuvewhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212173.8588042-1394-269053544088332/AnsiballZ_stat.py'
Feb 27 17:09:34 compute-0 sudo[212815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:34 compute-0 python3.9[212818]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:34 compute-0 sudo[212815]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:34 compute-0 sudo[212894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zokgtrwrsbexyyafiynvazvwvmedlhyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212173.8588042-1394-269053544088332/AnsiballZ_file.py'
Feb 27 17:09:34 compute-0 sudo[212894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:34 compute-0 podman[212896]: 2026-02-27 17:09:34.701326675 +0000 UTC m=+0.062678697 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 27 17:09:34 compute-0 python3.9[212903]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:34 compute-0 sudo[212894]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:35 compute-0 sudo[213067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujwwxpxqldfoyvrpihywgpbbaegnwhas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212175.022938-1406-230548092128633/AnsiballZ_stat.py'
Feb 27 17:09:35 compute-0 sudo[213067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:35 compute-0 python3.9[213070]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:35 compute-0 sudo[213067]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:35 compute-0 sudo[213146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lycacxelcpzoikddorwukptpaafvcbll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212175.022938-1406-230548092128633/AnsiballZ_file.py'
Feb 27 17:09:35 compute-0 sudo[213146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:36 compute-0 python3.9[213149]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:36 compute-0 sudo[213146]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:36 compute-0 sudo[213299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjnxlewocrqwqglrfcftjzxtdofnolik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212176.2658207-1418-90094599165729/AnsiballZ_stat.py'
Feb 27 17:09:36 compute-0 sudo[213299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:36 compute-0 python3.9[213302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:36 compute-0 sudo[213299]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:37 compute-0 sudo[213378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxmzixvekknlifgdtvimazbrltnqlvci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212176.2658207-1418-90094599165729/AnsiballZ_file.py'
Feb 27 17:09:37 compute-0 sudo[213378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:37 compute-0 python3.9[213381]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:37 compute-0 sudo[213378]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:37 compute-0 sudo[213531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpffnbcguhoyzngfaxoowkuhrckexmss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212177.535483-1430-180274795828752/AnsiballZ_stat.py'
Feb 27 17:09:37 compute-0 sudo[213531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:38 compute-0 python3.9[213534]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 27 17:09:38 compute-0 sudo[213531]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:38 compute-0 sudo[213657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfpuahhdgmfmcofljggexrtdskligklr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212177.535483-1430-180274795828752/AnsiballZ_copy.py'
Feb 27 17:09:38 compute-0 sudo[213657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:38 compute-0 python3.9[213660]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1772212177.535483-1430-180274795828752/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:38 compute-0 sudo[213657]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:39 compute-0 sudo[213810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovqfelvgfiaprhckjadsfoidfvglkvxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212178.9331405-1445-82936792309153/AnsiballZ_file.py'
Feb 27 17:09:39 compute-0 sudo[213810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:39 compute-0 python3.9[213813]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:39 compute-0 sudo[213810]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:39 compute-0 sudo[213974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdvytbozohlgasjdipgjxoirtrtibtrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212179.6663182-1453-274659952111188/AnsiballZ_command.py'
Feb 27 17:09:39 compute-0 sudo[213974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:40 compute-0 podman[213937]: 2026-02-27 17:09:40.003423871 +0000 UTC m=+0.074417701 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:09:40 compute-0 python3.9[213983]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:09:40 compute-0 sudo[213974]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:40 compute-0 sudo[214143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnqitybiwsdjnhmvpkitmztmbxpdjayo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212180.4094455-1461-219633026123239/AnsiballZ_blockinfile.py'
Feb 27 17:09:40 compute-0 sudo[214143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:41 compute-0 python3.9[214146]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:41 compute-0 sudo[214143]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:41 compute-0 sudo[214296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trxoqrhqqofknzjqfmtkrkarkxkiarsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212181.4098368-1470-12715895348905/AnsiballZ_command.py'
Feb 27 17:09:41 compute-0 sudo[214296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:41 compute-0 python3.9[214299]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:09:41 compute-0 sudo[214296]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:42 compute-0 sudo[214450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gaopzlxajsgxxsompejvxwcghicsuscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212182.0600348-1478-265299291204485/AnsiballZ_stat.py'
Feb 27 17:09:42 compute-0 sudo[214450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:42 compute-0 python3.9[214453]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 27 17:09:42 compute-0 sudo[214450]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:43 compute-0 sudo[214605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mkowystmlpadtjmtwwvxsyrhhxuftunb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212182.7769988-1486-179675516653769/AnsiballZ_command.py'
Feb 27 17:09:43 compute-0 sudo[214605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:43 compute-0 python3.9[214608]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 27 17:09:43 compute-0 sudo[214605]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:43 compute-0 sudo[214761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujummgutyovmcbipdaxdjzmvvfujaqma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1772212183.4458559-1494-159665715009015/AnsiballZ_file.py'
Feb 27 17:09:43 compute-0 sudo[214761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:09:43 compute-0 python3.9[214764]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 27 17:09:43 compute-0 sudo[214761]: pam_unix(sudo:session): session closed for user root
Feb 27 17:09:44 compute-0 sshd-session[187142]: Connection closed by 192.168.122.30 port 42900
Feb 27 17:09:44 compute-0 sshd-session[187139]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:09:44 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Feb 27 17:09:44 compute-0 systemd[1]: session-25.scope: Consumed 1min 44.947s CPU time.
Feb 27 17:09:44 compute-0 systemd-logind[803]: Session 25 logged out. Waiting for processes to exit.
Feb 27 17:09:44 compute-0 systemd-logind[803]: Removed session 25.
Feb 27 17:09:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:09:47.082 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:09:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:09:47.083 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:09:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:09:47.083 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:09:51 compute-0 podman[214789]: 2026-02-27 17:09:51.648340353 +0000 UTC m=+0.054290338 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:09:54 compute-0 podman[214813]: 2026-02-27 17:09:54.668373874 +0000 UTC m=+0.067490457 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true)
Feb 27 17:09:57 compute-0 podman[214832]: 2026-02-27 17:09:57.705992365 +0000 UTC m=+0.111565498 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 27 17:10:00 compute-0 podman[214859]: 2026-02-27 17:10:00.669380361 +0000 UTC m=+0.072967034 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, architecture=x86_64, release=1770267347, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible)
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.046 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.046 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:10:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:10:05 compute-0 podman[214879]: 2026-02-27 17:10:05.68820455 +0000 UTC m=+0.082012380 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, 
config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.716 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.716 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.717 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.717 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.748 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.749 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.749 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.750 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.950 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.952 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5895MB free_disk=73.23164749145508GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.953 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:10:05 compute-0 nova_compute[186840]: 2026-02-27 17:10:05.953 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.063 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.063 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.093 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.124 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.125 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:10:06 compute-0 nova_compute[186840]: 2026-02-27 17:10:06.126 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:10:07 compute-0 nova_compute[186840]: 2026-02-27 17:10:07.107 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:07 compute-0 nova_compute[186840]: 2026-02-27 17:10:07.108 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:07 compute-0 nova_compute[186840]: 2026-02-27 17:10:07.108 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:10:07 compute-0 nova_compute[186840]: 2026-02-27 17:10:07.109 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:10:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:09.376 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:10:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:09.377 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:10:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:09.379 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:10:10 compute-0 podman[214901]: 2026-02-27 17:10:10.664514655 +0000 UTC m=+0.066395229 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:10:22 compute-0 podman[214925]: 2026-02-27 17:10:22.668604662 +0000 UTC m=+0.064014461 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:10:25 compute-0 podman[214950]: 2026-02-27 17:10:25.669305129 +0000 UTC m=+0.073434174 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0)
Feb 27 17:10:28 compute-0 podman[214969]: 2026-02-27 17:10:28.713089446 +0000 UTC m=+0.107784622 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20260223, tcib_managed=true)
Feb 27 17:10:31 compute-0 podman[214996]: 2026-02-27 17:10:31.708163192 +0000 UTC m=+0.105605508 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2026-02-05T04:57:10Z, release=1770267347, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 27 17:10:36 compute-0 podman[215018]: 2026-02-27 17:10:36.640399225 +0000 UTC m=+0.051027597 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true)
Feb 27 17:10:41 compute-0 podman[215038]: 2026-02-27 17:10:41.673366256 +0000 UTC m=+0.081609015 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:10:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:47.083 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:10:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:47.084 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:10:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:10:47.084 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:10:53 compute-0 podman[215063]: 2026-02-27 17:10:53.676587625 +0000 UTC m=+0.076776266 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:10:56 compute-0 podman[215087]: 2026-02-27 17:10:56.662835567 +0000 UTC m=+0.068517974 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 27 17:10:59 compute-0 podman[215106]: 2026-02-27 17:10:59.701800877 +0000 UTC m=+0.101743125 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:11:02 compute-0 podman[215134]: 2026-02-27 17:11:02.668626335 +0000 UTC m=+0.070450551 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, managed_by=edpm_ansible, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:11:03 compute-0 nova_compute[186840]: 2026-02-27 17:11:03.696 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.771 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.772 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.772 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.772 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.970 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.971 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6016MB free_disk=73.23165512084961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.971 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:11:05 compute-0 nova_compute[186840]: 2026-02-27 17:11:05.972 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.061 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.061 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.091 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.124 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.127 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:11:06 compute-0 nova_compute[186840]: 2026-02-27 17:11:06.127 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.126 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.127 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.127 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.147 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.147 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 podman[215155]: 2026-02-27 17:11:07.68950029 +0000 UTC m=+0.089728101 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:11:07 compute-0 nova_compute[186840]: 2026-02-27 17:11:07.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:11:12 compute-0 podman[215175]: 2026-02-27 17:11:12.668510053 +0000 UTC m=+0.072915091 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:11:24 compute-0 podman[215200]: 2026-02-27 17:11:24.668495673 +0000 UTC m=+0.075100374 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:11:27 compute-0 podman[215225]: 2026-02-27 17:11:27.66761075 +0000 UTC m=+0.068486863 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 27 17:11:30 compute-0 podman[215244]: 2026-02-27 17:11:30.698163615 +0000 UTC m=+0.105634737 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260223, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, io.buildah.version=1.43.0)
Feb 27 17:11:33 compute-0 podman[215271]: 2026-02-27 17:11:33.668850242 +0000 UTC m=+0.068149180 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, version=9.7, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 27 17:11:38 compute-0 podman[215292]: 2026-02-27 17:11:38.677708383 +0000 UTC m=+0.079890111 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 27 17:11:43 compute-0 podman[215312]: 2026-02-27 17:11:43.646655657 +0000 UTC m=+0.054944065 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:11:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:11:47.085 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:11:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:11:47.086 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:11:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:11:47.086 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:11:55 compute-0 podman[215335]: 2026-02-27 17:11:55.645403502 +0000 UTC m=+0.053469330 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:11:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:11:58.350 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:11:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:11:58.351 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:11:58 compute-0 podman[215360]: 2026-02-27 17:11:58.649119589 +0000 UTC m=+0.055636351 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:12:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:01.353 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:01 compute-0 podman[215379]: 2026-02-27 17:12:01.69860175 +0000 UTC m=+0.102246676 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, 
container_name=ovn_controller, io.buildah.version=1.43.0)
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.727 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.728 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.728 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 27 17:12:02 compute-0 nova_compute[186840]: 2026-02-27 17:12:02.760 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:04 compute-0 podman[215405]: 2026-02-27 17:12:04.668853387 +0000 UTC m=+0.067658159 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, managed_by=edpm_ansible, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped 
down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:12:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.790 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.791 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.791 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.825 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.825 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.826 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.826 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.972 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.973 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6067MB free_disk=73.23336029052734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.973 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:06 compute-0 nova_compute[186840]: 2026-02-27 17:12:06.973 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:07 compute-0 rsyslogd[1012]: imjournal: 1506 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.285 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.286 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.349 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.428 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.428 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.449 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.481 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.505 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.528 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.531 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:12:07 compute-0 nova_compute[186840]: 2026-02-27 17:12:07.532 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.558s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.436 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.437 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.437 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.438 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.724 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:12:08 compute-0 nova_compute[186840]: 2026-02-27 17:12:08.724 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:09 compute-0 nova_compute[186840]: 2026-02-27 17:12:09.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:12:09 compute-0 podman[215427]: 2026-02-27 17:12:09.711825893 +0000 UTC m=+0.119474038 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:12:14 compute-0 podman[215447]: 2026-02-27 17:12:14.700388198 +0000 UTC m=+0.105431492 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.216 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.217 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.252 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.446 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.448 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.459 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.460 186844 INFO nova.compute.claims [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.764 186844 DEBUG nova.compute.provider_tree [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.778 186844 DEBUG nova.scheduler.client.report [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.829 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.830 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.880 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.881 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.915 186844 INFO nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:12:25 compute-0 nova_compute[186840]: 2026-02-27 17:12:25.955 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.061 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.063 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.063 186844 INFO nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Creating image(s)
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.064 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.065 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.066 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.066 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.067 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.455 186844 WARNING oslo_policy.policy [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.456 186844 WARNING oslo_policy.policy [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 27 17:12:26 compute-0 nova_compute[186840]: 2026-02-27 17:12:26.461 186844 DEBUG nova.policy [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:12:26 compute-0 podman[215471]: 2026-02-27 17:12:26.668215311 +0000 UTC m=+0.068211322 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.315 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.381 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Successfully created port: 230aa4f7-60f7-415f-98c3-f586b1551f43 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.393 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.part --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.394 186844 DEBUG nova.virt.images [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] b49463d5-90a4-4c27-9dac-a140f152eabc was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.396 186844 DEBUG nova.privsep.utils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.396 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.part /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.571 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.part /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.converted" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.574 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.617 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d.converted --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.618 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.551s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:28 compute-0 nova_compute[186840]: 2026-02-27 17:12:28.629 186844 INFO oslo.privsep.daemon [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp87aq4chf/privsep.sock']
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.226 186844 INFO oslo.privsep.daemon [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Spawned new privsep daemon via rootwrap
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.109 215513 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.115 215513 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.118 215513 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.119 215513 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215513
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.311 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.371 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.372 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.373 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.398 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.441 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.443 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.467 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Successfully updated port: 230aa4f7-60f7-415f-98c3-f586b1551f43 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.482 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.484 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.484 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.499 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.500 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.500 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.560 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.561 186844 DEBUG nova.virt.disk.api [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.562 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.614 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.615 186844 DEBUG nova.virt.disk.api [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.615 186844 DEBUG nova.objects.instance [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 826adfc9-edc2-47cf-82ae-f8b79aebaa68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.634 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.634 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Ensure instance console log exists: /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.634 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.635 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.635 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:29 compute-0 podman[215528]: 2026-02-27 17:12:29.686867935 +0000 UTC m=+0.083493528 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:12:29 compute-0 nova_compute[186840]: 2026-02-27 17:12:29.697 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.044 186844 DEBUG nova.compute.manager [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.044 186844 DEBUG nova.compute.manager [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing instance network info cache due to event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.045 186844 DEBUG oslo_concurrency.lockutils [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.587 186844 DEBUG nova.network.neutron [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.612 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.612 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Instance network_info: |[{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.614 186844 DEBUG oslo_concurrency.lockutils [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.614 186844 DEBUG nova.network.neutron [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.622 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Start _get_guest_xml network_info=[{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.631 186844 WARNING nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.640 186844 DEBUG nova.virt.libvirt.host [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.641 186844 DEBUG nova.virt.libvirt.host [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.648 186844 DEBUG nova.virt.libvirt.host [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.649 186844 DEBUG nova.virt.libvirt.host [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.650 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.651 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.652 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.652 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.652 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.653 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.653 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.654 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.654 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.655 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.655 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.655 186844 DEBUG nova.virt.hardware [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.661 186844 DEBUG nova.privsep.utils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.663 186844 DEBUG nova.virt.libvirt.vif [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1769989291',display_name='tempest-TestNetworkBasicOps-server-1769989291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1769989291',id=1,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChDldA9eHt69UmmFDwvFUnEpXA7RKYuw4U1QoiTTfttm2GMj/uAfp8mL+79aV7KIdshvzwUkOP1mGNagvdQMwWrNtdrRQcXKglkmafYrQN13J3tRiiJ795KKNmbNnHXUQ==',key_name='tempest-TestNetworkBasicOps-1253152918',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-dty06rmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:12:25Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=826adfc9-edc2-47cf-82ae-f8b79aebaa68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.664 186844 DEBUG nova.network.os_vif_util [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.665 186844 DEBUG nova.network.os_vif_util [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.667 186844 DEBUG nova.objects.instance [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 826adfc9-edc2-47cf-82ae-f8b79aebaa68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.685 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <uuid>826adfc9-edc2-47cf-82ae-f8b79aebaa68</uuid>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <name>instance-00000001</name>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1769989291</nova:name>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:12:30</nova:creationTime>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         <nova:port uuid="230aa4f7-60f7-415f-98c3-f586b1551f43">
Feb 27 17:12:30 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <system>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="serial">826adfc9-edc2-47cf-82ae-f8b79aebaa68</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="uuid">826adfc9-edc2-47cf-82ae-f8b79aebaa68</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </system>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <os>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </os>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <features>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </features>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.config"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:53:2f:65"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <target dev="tap230aa4f7-60"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/console.log" append="off"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <video>
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </video>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:12:30 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:12:30 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:12:30 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:12:30 compute-0 nova_compute[186840]: </domain>
Feb 27 17:12:30 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.687 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Preparing to wait for external event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.688 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.689 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.690 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.691 186844 DEBUG nova.virt.libvirt.vif [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1769989291',display_name='tempest-TestNetworkBasicOps-server-1769989291',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1769989291',id=1,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChDldA9eHt69UmmFDwvFUnEpXA7RKYuw4U1QoiTTfttm2GMj/uAfp8mL+79aV7KIdshvzwUkOP1mGNagvdQMwWrNtdrRQcXKglkmafYrQN13J3tRiiJ795KKNmbNnHXUQ==',key_name='tempest-TestNetworkBasicOps-1253152918',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-dty06rmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:12:25Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=826adfc9-edc2-47cf-82ae-f8b79aebaa68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.691 186844 DEBUG nova.network.os_vif_util [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.692 186844 DEBUG nova.network.os_vif_util [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.693 186844 DEBUG os_vif [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.747 186844 DEBUG ovsdbapp.backend.ovs_idl [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.747 186844 DEBUG ovsdbapp.backend.ovs_idl [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.748 186844 DEBUG ovsdbapp.backend.ovs_idl [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.748 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.749 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.749 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.750 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.752 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.754 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.765 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.765 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.766 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:12:30 compute-0 nova_compute[186840]: 2026-02-27 17:12:30.766 186844 INFO oslo.privsep.daemon [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpakv9kxaz/privsep.sock']
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.415 186844 INFO oslo.privsep.daemon [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Spawned new privsep daemon via rootwrap
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.305 215552 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.310 215552 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.314 215552 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.314 215552 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215552
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.703 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.704 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap230aa4f7-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.705 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap230aa4f7-60, col_values=(('external_ids', {'iface-id': '230aa4f7-60f7-415f-98c3-f586b1551f43', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:2f:65', 'vm-uuid': '826adfc9-edc2-47cf-82ae-f8b79aebaa68'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.707 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:31 compute-0 NetworkManager[56537]: <info>  [1772212351.7105] manager: (tap230aa4f7-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.711 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.716 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.718 186844 INFO os_vif [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60')
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.848 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.849 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.849 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:53:2f:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:12:31 compute-0 nova_compute[186840]: 2026-02-27 17:12:31.850 186844 INFO nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Using config drive
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.651 186844 INFO nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Creating config drive at /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.config
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.657 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp_3fxcc0q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:12:32 compute-0 podman[215558]: 2026-02-27 17:12:32.708354897 +0000 UTC m=+0.104542480 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.777 186844 DEBUG oslo_concurrency.processutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp_3fxcc0q" returned: 0 in 0.120s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:12:32 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 27 17:12:32 compute-0 NetworkManager[56537]: <info>  [1772212352.8549] manager: (tap230aa4f7-60): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Feb 27 17:12:32 compute-0 kernel: tap230aa4f7-60: entered promiscuous mode
Feb 27 17:12:32 compute-0 ovn_controller[96756]: 2026-02-27T17:12:32Z|00027|binding|INFO|Claiming lport 230aa4f7-60f7-415f-98c3-f586b1551f43 for this chassis.
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.856 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:32 compute-0 ovn_controller[96756]: 2026-02-27T17:12:32Z|00028|binding|INFO|230aa4f7-60f7-415f-98c3-f586b1551f43: Claiming fa:16:3e:53:2f:65 10.100.0.6
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.862 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:32.872 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:2f:65 10.100.0.6'], port_security=['fa:16:3e:53:2f:65 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '826adfc9-edc2-47cf-82ae-f8b79aebaa68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1e09f223-d0ad-4562-8f86-7f3bcd96c7c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc7bb2cb-ff0b-4036-98ea-f69638f43f43, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=230aa4f7-60f7-415f-98c3-f586b1551f43) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:12:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:32.874 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 230aa4f7-60f7-415f-98c3-f586b1551f43 in datapath 8cc3eca5-483a-473a-bac8-5f86e54d4447 bound to our chassis
Feb 27 17:12:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:32.877 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8cc3eca5-483a-473a-bac8-5f86e54d4447
Feb 27 17:12:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:32.878 106085 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwi9bs843/privsep.sock']
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.895 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:32 compute-0 ovn_controller[96756]: 2026-02-27T17:12:32Z|00029|binding|INFO|Setting lport 230aa4f7-60f7-415f-98c3-f586b1551f43 ovn-installed in OVS
Feb 27 17:12:32 compute-0 ovn_controller[96756]: 2026-02-27T17:12:32Z|00030|binding|INFO|Setting lport 230aa4f7-60f7-415f-98c3-f586b1551f43 up in Southbound
Feb 27 17:12:32 compute-0 systemd-udevd[215607]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.900 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.908 186844 DEBUG nova.network.neutron [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updated VIF entry in instance network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.909 186844 DEBUG nova.network.neutron [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:12:32 compute-0 NetworkManager[56537]: <info>  [1772212352.9147] device (tap230aa4f7-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:12:32 compute-0 NetworkManager[56537]: <info>  [1772212352.9156] device (tap230aa4f7-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:12:32 compute-0 systemd-machined[156136]: New machine qemu-1-instance-00000001.
Feb 27 17:12:32 compute-0 nova_compute[186840]: 2026-02-27 17:12:32.933 186844 DEBUG oslo_concurrency.lockutils [req-148a8216-d36a-440a-8110-c1913529490d req-e27432df-2f4e-4e67-83ff-8c4a8a2ca7e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:12:32 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.127 186844 DEBUG nova.compute.manager [req-6018ffb1-3831-4c14-b263-fd72a81a23bd req-3664506a-2967-4864-860d-82c0cc823013 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.128 186844 DEBUG oslo_concurrency.lockutils [req-6018ffb1-3831-4c14-b263-fd72a81a23bd req-3664506a-2967-4864-860d-82c0cc823013 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.128 186844 DEBUG oslo_concurrency.lockutils [req-6018ffb1-3831-4c14-b263-fd72a81a23bd req-3664506a-2967-4864-860d-82c0cc823013 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.129 186844 DEBUG oslo_concurrency.lockutils [req-6018ffb1-3831-4c14-b263-fd72a81a23bd req-3664506a-2967-4864-860d-82c0cc823013 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.129 186844 DEBUG nova.compute.manager [req-6018ffb1-3831-4c14-b263-fd72a81a23bd req-3664506a-2967-4864-860d-82c0cc823013 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Processing event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.273 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.276 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212353.2731535, 826adfc9-edc2-47cf-82ae-f8b79aebaa68 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.276 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] VM Started (Lifecycle Event)
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.295 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.299 186844 INFO nova.virt.libvirt.driver [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Instance spawned successfully.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.299 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.341 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.344 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.367 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.368 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212353.2793586, 826adfc9-edc2-47cf-82ae-f8b79aebaa68 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.368 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] VM Paused (Lifecycle Event)
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.373 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.373 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.374 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.374 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.375 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.375 186844 DEBUG nova.virt.libvirt.driver [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.386 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.401 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212353.2795334, 826adfc9-edc2-47cf-82ae-f8b79aebaa68 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.402 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] VM Resumed (Lifecycle Event)
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.421 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.424 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.447 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.466 186844 INFO nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Took 7.40 seconds to spawn the instance on the hypervisor.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.467 186844 DEBUG nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.484 106085 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.485 106085 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwi9bs843/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.375 215632 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.379 215632 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.382 215632 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.382 215632 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215632
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.488 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f8ed052e-e0a3-43d1-aa1c-4316fa9abcc3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.531 186844 INFO nova.compute.manager [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Took 8.14 seconds to build instance.
Feb 27 17:12:33 compute-0 nova_compute[186840]: 2026-02-27 17:12:33.548 186844 DEBUG oslo_concurrency.lockutils [None req-38ef29e8-117c-49a0-b6fd-2980563af532 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.330s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.971 215632 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.971 215632 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:33.971 215632 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.460 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[91cd2106-3006-4531-95fe-13f1dbb51314]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.461 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8cc3eca5-41 in ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.464 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8cc3eca5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.464 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[728ed89b-85e3-4f8a-b7f5-d7538ae9fb4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.466 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[8581a4c5-2019-4054-b2b2-3a6889ef248b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.506 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[ce71d2ed-e7d4-4fd2-8180-7871d81fe479]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.533 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[bdb7219e-25e3-462f-ae17-ca2e14eb73e0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:34.535 106085 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1wtn7fk9/privsep.sock']
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.096 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.184 106085 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.185 106085 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1wtn7fk9/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.048 215646 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.052 215646 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.054 215646 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.054 215646 INFO oslo.privsep.daemon [-] privsep daemon running as pid 215646
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.188 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[e777745a-7feb-4c8f-a046-c4e85fb86f49]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.243 186844 DEBUG nova.compute.manager [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.244 186844 DEBUG oslo_concurrency.lockutils [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.244 186844 DEBUG oslo_concurrency.lockutils [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.245 186844 DEBUG oslo_concurrency.lockutils [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.245 186844 DEBUG nova.compute.manager [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] No waiting events found dispatching network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:12:35 compute-0 nova_compute[186840]: 2026-02-27 17:12:35.246 186844 WARNING nova.compute.manager [req-75091251-3cc0-433b-bb68-c821bfce770a req-ee2095f6-49d7-488d-8bf3-827f4d381640 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received unexpected event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 for instance with vm_state active and task_state None.
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.665 215646 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.665 215646 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:35.665 215646 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:35 compute-0 podman[215651]: 2026-02-27 17:12:35.671937732 +0000 UTC m=+0.071135058 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, version=9.7, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.165 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[974b8fdc-c1ef-41d0-bd46-8a3bba00ea14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.189 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4db01420-2e90-42f9-b7a0-c30d14a61617]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 NetworkManager[56537]: <info>  [1772212356.1918] manager: (tap8cc3eca5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.213 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3edaf8-c463-4ed2-9798-9b85797784bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.216 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[807f2f2d-9294-4609-a2cb-c6f9d25d2eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 systemd-udevd[215681]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:12:36 compute-0 NetworkManager[56537]: <info>  [1772212356.2422] device (tap8cc3eca5-40): carrier: link connected
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.246 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[95167ecf-c29e-433e-9fd3-4f3c4ccf0219]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.264 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9e27ed-fffe-40cb-94cd-6dceb51cbace]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cc3eca5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d8:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323173, 'reachable_time': 16905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215698, 'error': None, 'target': 'ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.276 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[bc27771f-82f0-4dba-8064-9cb70e30a1f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:d808'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 323173, 'tstamp': 323173}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215699, 'error': None, 'target': 'ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.289 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0210602e-28a0-4329-9133-9201dac5a265]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8cc3eca5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d8:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323173, 'reachable_time': 16905, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215700, 'error': None, 'target': 'ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.308 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[d8275b57-0d73-4b9a-9ab6-fd098784180b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.342 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[189c9b76-a958-406b-b87a-1c7162ef1487]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.343 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cc3eca5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.344 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.344 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8cc3eca5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:36 compute-0 nova_compute[186840]: 2026-02-27 17:12:36.345 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:36 compute-0 kernel: tap8cc3eca5-40: entered promiscuous mode
Feb 27 17:12:36 compute-0 NetworkManager[56537]: <info>  [1772212356.3477] manager: (tap8cc3eca5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Feb 27 17:12:36 compute-0 nova_compute[186840]: 2026-02-27 17:12:36.350 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.351 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8cc3eca5-40, col_values=(('external_ids', {'iface-id': 'f65af9bf-2e56-4266-8f83-fdf23f039108'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:12:36 compute-0 nova_compute[186840]: 2026-02-27 17:12:36.352 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:36 compute-0 ovn_controller[96756]: 2026-02-27T17:12:36Z|00031|binding|INFO|Releasing lport f65af9bf-2e56-4266-8f83-fdf23f039108 from this chassis (sb_readonly=0)
Feb 27 17:12:36 compute-0 nova_compute[186840]: 2026-02-27 17:12:36.358 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.359 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8cc3eca5-483a-473a-bac8-5f86e54d4447.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8cc3eca5-483a-473a-bac8-5f86e54d4447.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.360 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c87f454e-fd7b-46f6-9416-b32fa065f844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.361 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-8cc3eca5-483a-473a-bac8-5f86e54d4447
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/8cc3eca5-483a-473a-bac8-5f86e54d4447.pid.haproxy
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 8cc3eca5-483a-473a-bac8-5f86e54d4447
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:12:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:36.362 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'env', 'PROCESS_TAG=haproxy-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8cc3eca5-483a-473a-bac8-5f86e54d4447.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:12:36 compute-0 podman[215733]: 2026-02-27 17:12:36.709533352 +0000 UTC m=+0.065206896 container create 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:12:36 compute-0 nova_compute[186840]: 2026-02-27 17:12:36.708 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:36 compute-0 systemd[1]: Started libpod-conmon-10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c.scope.
Feb 27 17:12:36 compute-0 podman[215733]: 2026-02-27 17:12:36.674335602 +0000 UTC m=+0.030009186 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:12:36 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:12:36 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eee797cadf0e20cde99125dcf10a7401e7b5a4c1b5464b43abbbd8897e6d162f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:12:36 compute-0 podman[215733]: 2026-02-27 17:12:36.793785596 +0000 UTC m=+0.149459170 container init 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 27 17:12:36 compute-0 podman[215733]: 2026-02-27 17:12:36.79727351 +0000 UTC m=+0.152947064 container start 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223)
Feb 27 17:12:36 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [NOTICE]   (215752) : New worker (215754) forked
Feb 27 17:12:36 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [NOTICE]   (215752) : Loading success.
Feb 27 17:12:39 compute-0 ovn_controller[96756]: 2026-02-27T17:12:39Z|00032|binding|INFO|Releasing lport f65af9bf-2e56-4266-8f83-fdf23f039108 from this chassis (sb_readonly=0)
Feb 27 17:12:39 compute-0 nova_compute[186840]: 2026-02-27 17:12:39.466 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4710] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/23)
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4722] device (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <warn>  [1772212359.4725] device (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4745] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/24)
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4753] device (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <warn>  [1772212359.4754] device (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4772] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4843] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4851] device (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 27 17:12:39 compute-0 NetworkManager[56537]: <info>  [1772212359.4855] device (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Feb 27 17:12:39 compute-0 ovn_controller[96756]: 2026-02-27T17:12:39Z|00033|binding|INFO|Releasing lport f65af9bf-2e56-4266-8f83-fdf23f039108 from this chassis (sb_readonly=0)
Feb 27 17:12:39 compute-0 nova_compute[186840]: 2026-02-27 17:12:39.487 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:39 compute-0 nova_compute[186840]: 2026-02-27 17:12:39.493 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.125 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.326 186844 DEBUG nova.compute.manager [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.327 186844 DEBUG nova.compute.manager [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing instance network info cache due to event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.328 186844 DEBUG oslo_concurrency.lockutils [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.328 186844 DEBUG oslo_concurrency.lockutils [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:12:40 compute-0 nova_compute[186840]: 2026-02-27 17:12:40.329 186844 DEBUG nova.network.neutron [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:12:40 compute-0 podman[215764]: 2026-02-27 17:12:40.662447415 +0000 UTC m=+0.062278444 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Feb 27 17:12:41 compute-0 nova_compute[186840]: 2026-02-27 17:12:41.640 186844 DEBUG nova.network.neutron [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updated VIF entry in instance network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:12:41 compute-0 nova_compute[186840]: 2026-02-27 17:12:41.641 186844 DEBUG nova.network.neutron [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:12:41 compute-0 nova_compute[186840]: 2026-02-27 17:12:41.674 186844 DEBUG oslo_concurrency.lockutils [req-df6b0867-2e28-4ece-9c05-41e21715afcd req-4545f1c7-1917-4589-8d47-aa197451e2d7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:12:41 compute-0 nova_compute[186840]: 2026-02-27 17:12:41.716 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:44 compute-0 ovn_controller[96756]: 2026-02-27T17:12:44Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:2f:65 10.100.0.6
Feb 27 17:12:44 compute-0 ovn_controller[96756]: 2026-02-27T17:12:44Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:2f:65 10.100.0.6
Feb 27 17:12:45 compute-0 nova_compute[186840]: 2026-02-27 17:12:45.126 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:45 compute-0 podman[215797]: 2026-02-27 17:12:45.692216398 +0000 UTC m=+0.090109177 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:12:46 compute-0 nova_compute[186840]: 2026-02-27 17:12:46.719 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:47.087 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:12:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:47.088 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:12:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:12:47.088 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:12:50 compute-0 nova_compute[186840]: 2026-02-27 17:12:50.128 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:50 compute-0 nova_compute[186840]: 2026-02-27 17:12:50.685 186844 INFO nova.compute.manager [None req-8b8200dd-8a88-4941-82c8-b0e1a8f14730 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Get console output
Feb 27 17:12:50 compute-0 nova_compute[186840]: 2026-02-27 17:12:50.824 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:12:51 compute-0 nova_compute[186840]: 2026-02-27 17:12:51.720 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:55 compute-0 nova_compute[186840]: 2026-02-27 17:12:55.129 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:56 compute-0 nova_compute[186840]: 2026-02-27 17:12:56.722 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:12:57 compute-0 podman[215821]: 2026-02-27 17:12:57.673555082 +0000 UTC m=+0.075881413 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:13:00 compute-0 nova_compute[186840]: 2026-02-27 17:13:00.133 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:00 compute-0 nova_compute[186840]: 2026-02-27 17:13:00.273 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:00 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:00.274 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:13:00 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:00.275 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:13:00 compute-0 podman[215845]: 2026-02-27 17:13:00.635180144 +0000 UTC m=+0.047680973 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:13:01 compute-0 nova_compute[186840]: 2026-02-27 17:13:01.726 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:03 compute-0 nova_compute[186840]: 2026-02-27 17:13:03.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:03 compute-0 podman[215864]: 2026-02-27 17:13:03.73039982 +0000 UTC m=+0.134873758 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:13:05 compute-0 nova_compute[186840]: 2026-02-27 17:13:05.156 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.393 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.393 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.412 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.557 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.558 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.571 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.571 186844 INFO nova.compute.claims [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:13:06 compute-0 podman[215891]: 2026-02-27 17:13:06.695928106 +0000 UTC m=+0.101006950 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1770267347, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.728 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.753 186844 DEBUG nova.compute.provider_tree [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.796 186844 ERROR nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [req-5141bc2a-ced0-4d45-ab94-8fd35e4b05c0] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 2b4df47a-58ba-41db-b94b-eb594c2f9699.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-5141bc2a-ced0-4d45-ab94-8fd35e4b05c0"}]}
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.810 186844 DEBUG nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.829 186844 DEBUG nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.830 186844 DEBUG nova.compute.provider_tree [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.850 186844 DEBUG nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.869 186844 DEBUG nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:13:06 compute-0 nova_compute[186840]: 2026-02-27 17:13:06.956 186844 DEBUG nova.compute.provider_tree [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.025 186844 DEBUG nova.scheduler.client.report [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updated inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.025 186844 DEBUG nova.compute.provider_tree [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.026 186844 DEBUG nova.compute.provider_tree [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.058 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.500s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.059 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.108 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.109 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.150 186844 INFO nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.198 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.332 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.335 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.335 186844 INFO nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Creating image(s)
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.336 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.337 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.338 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.366 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.382 186844 DEBUG nova.policy [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.419 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.421 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.422 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.442 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.488 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.489 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.532 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.533 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.534 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.590 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.592 186844 DEBUG nova.virt.disk.api [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.593 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.665 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.666 186844 DEBUG nova.virt.disk.api [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.667 186844 DEBUG nova.objects.instance [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 7907c492-fa28-419d-94ba-339df849c7be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.694 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.695 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Ensure instance console log exists: /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.695 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.696 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.696 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.923 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.924 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.924 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:07 compute-0 nova_compute[186840]: 2026-02-27 17:13:07.924 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.017 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.094 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.096 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.177 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.346 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.348 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5586MB free_disk=73.1658706665039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.348 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.348 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.426 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 826adfc9-edc2-47cf-82ae-f8b79aebaa68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.427 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 7907c492-fa28-419d-94ba-339df849c7be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.427 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.427 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.523 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.543 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.586 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:13:08 compute-0 nova_compute[186840]: 2026-02-27 17:13:08.587 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:09.278 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:09 compute-0 nova_compute[186840]: 2026-02-27 17:13:09.582 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:09 compute-0 nova_compute[186840]: 2026-02-27 17:13:09.583 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:09 compute-0 nova_compute[186840]: 2026-02-27 17:13:09.583 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:09 compute-0 nova_compute[186840]: 2026-02-27 17:13:09.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:09 compute-0 nova_compute[186840]: 2026-02-27 17:13:09.698 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:13:10 compute-0 nova_compute[186840]: 2026-02-27 17:13:10.158 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:10 compute-0 nova_compute[186840]: 2026-02-27 17:13:10.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:10 compute-0 nova_compute[186840]: 2026-02-27 17:13:10.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:13:10 compute-0 nova_compute[186840]: 2026-02-27 17:13:10.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:13:10 compute-0 nova_compute[186840]: 2026-02-27 17:13:10.734 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.213 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.213 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquired lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.213 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.213 186844 DEBUG nova.objects.instance [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 826adfc9-edc2-47cf-82ae-f8b79aebaa68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.216 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Successfully created port: 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:13:11 compute-0 podman[215935]: 2026-02-27 17:13:11.701050794 +0000 UTC m=+0.100475257 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 27 17:13:11 compute-0 nova_compute[186840]: 2026-02-27 17:13:11.729 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.506 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Successfully updated port: 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.543 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.543 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.543 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.652 186844 DEBUG nova.compute.manager [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-changed-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.652 186844 DEBUG nova.compute.manager [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Refreshing instance network info cache due to event network-changed-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:13:13 compute-0 nova_compute[186840]: 2026-02-27 17:13:13.653 186844 DEBUG oslo_concurrency.lockutils [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:13:14 compute-0 nova_compute[186840]: 2026-02-27 17:13:14.303 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:13:14 compute-0 nova_compute[186840]: 2026-02-27 17:13:14.869 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:14 compute-0 nova_compute[186840]: 2026-02-27 17:13:14.892 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Releasing lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:13:14 compute-0 nova_compute[186840]: 2026-02-27 17:13:14.892 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 27 17:13:14 compute-0 nova_compute[186840]: 2026-02-27 17:13:14.893 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:13:15 compute-0 nova_compute[186840]: 2026-02-27 17:13:15.160 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.296 186844 DEBUG nova.network.neutron [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Updating instance_info_cache with network_info: [{"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.324 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.325 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Instance network_info: |[{"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.325 186844 DEBUG oslo_concurrency.lockutils [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.326 186844 DEBUG nova.network.neutron [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Refreshing network info cache for port 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.330 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Start _get_guest_xml network_info=[{"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.336 186844 WARNING nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.348 186844 DEBUG nova.virt.libvirt.host [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.349 186844 DEBUG nova.virt.libvirt.host [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.355 186844 DEBUG nova.virt.libvirt.host [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.356 186844 DEBUG nova.virt.libvirt.host [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.357 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.357 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.358 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.358 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.358 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.359 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.359 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.359 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.360 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.360 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.361 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.361 186844 DEBUG nova.virt.hardware [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.367 186844 DEBUG nova.virt.libvirt.vif [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:13:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1662499746',display_name='tempest-TestNetworkBasicOps-server-1662499746',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1662499746',id=2,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPJpvJXjYyaP9weXRX/frAT3/yEysE0lVxAsjmpNGAzBT9Oj2vbtxwuS1na7jO3chcTPpLrlUFwb1XAnLTLReglOwXXu8OR5Kmokm6p8kdkdUiOWT63Ya5K18FBRbp4C4A==',key_name='tempest-TestNetworkBasicOps-1344904446',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-7jo6re60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:13:07Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=7907c492-fa28-419d-94ba-339df849c7be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.368 186844 DEBUG nova.network.os_vif_util [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.369 186844 DEBUG nova.network.os_vif_util [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.370 186844 DEBUG nova.objects.instance [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7907c492-fa28-419d-94ba-339df849c7be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.385 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <uuid>7907c492-fa28-419d-94ba-339df849c7be</uuid>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <name>instance-00000002</name>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1662499746</nova:name>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:13:16</nova:creationTime>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         <nova:port uuid="5ea75f63-b49d-4e41-8e27-d6f2a79d89b0">
Feb 27 17:13:16 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <system>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="serial">7907c492-fa28-419d-94ba-339df849c7be</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="uuid">7907c492-fa28-419d-94ba-339df849c7be</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </system>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <os>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </os>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <features>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </features>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.config"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:65:d5:aa"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <target dev="tap5ea75f63-b4"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/console.log" append="off"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <video>
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </video>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:13:16 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:13:16 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:13:16 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:13:16 compute-0 nova_compute[186840]: </domain>
Feb 27 17:13:16 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.387 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Preparing to wait for external event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.387 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.387 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.387 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.388 186844 DEBUG nova.virt.libvirt.vif [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:13:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1662499746',display_name='tempest-TestNetworkBasicOps-server-1662499746',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1662499746',id=2,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPJpvJXjYyaP9weXRX/frAT3/yEysE0lVxAsjmpNGAzBT9Oj2vbtxwuS1na7jO3chcTPpLrlUFwb1XAnLTLReglOwXXu8OR5Kmokm6p8kdkdUiOWT63Ya5K18FBRbp4C4A==',key_name='tempest-TestNetworkBasicOps-1344904446',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-7jo6re60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:13:07Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=7907c492-fa28-419d-94ba-339df849c7be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.388 186844 DEBUG nova.network.os_vif_util [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.389 186844 DEBUG nova.network.os_vif_util [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.389 186844 DEBUG os_vif [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.390 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.390 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.391 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.394 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.395 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5ea75f63-b4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.395 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5ea75f63-b4, col_values=(('external_ids', {'iface-id': '5ea75f63-b49d-4e41-8e27-d6f2a79d89b0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:d5:aa', 'vm-uuid': '7907c492-fa28-419d-94ba-339df849c7be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.397 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:16 compute-0 NetworkManager[56537]: <info>  [1772212396.3983] manager: (tap5ea75f63-b4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.399 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.402 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.403 186844 INFO os_vif [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4')
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.456 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.456 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.457 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:65:d5:aa, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.457 186844 INFO nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Using config drive
Feb 27 17:13:16 compute-0 podman[215960]: 2026-02-27 17:13:16.671101744 +0000 UTC m=+0.083806074 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.859 186844 INFO nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Creating config drive at /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.config
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.866 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp909_a518 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:13:16 compute-0 nova_compute[186840]: 2026-02-27 17:13:16.993 186844 DEBUG oslo_concurrency.processutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp909_a518" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:13:17 compute-0 kernel: tap5ea75f63-b4: entered promiscuous mode
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.0427] manager: (tap5ea75f63-b4): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Feb 27 17:13:17 compute-0 ovn_controller[96756]: 2026-02-27T17:13:17Z|00034|binding|INFO|Claiming lport 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 for this chassis.
Feb 27 17:13:17 compute-0 ovn_controller[96756]: 2026-02-27T17:13:17Z|00035|binding|INFO|5ea75f63-b49d-4e41-8e27-d6f2a79d89b0: Claiming fa:16:3e:65:d5:aa 10.100.0.25
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.043 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.049 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.058 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:d5:aa 10.100.0.25'], port_security=['fa:16:3e:65:d5:aa 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7907c492-fa28-419d-94ba-339df849c7be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '57746e65-06e9-4e6e-834d-171c9ed58897', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9854a9e-6d8f-43f4-b0b6-716f9aa4a647, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:13:17 compute-0 ovn_controller[96756]: 2026-02-27T17:13:17Z|00036|binding|INFO|Setting lport 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 ovn-installed in OVS
Feb 27 17:13:17 compute-0 ovn_controller[96756]: 2026-02-27T17:13:17Z|00037|binding|INFO|Setting lport 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 up in Southbound
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.060 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 in datapath 57b86049-a87b-4ca2-9a21-6b9fa70855da bound to our chassis
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.061 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.062 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57b86049-a87b-4ca2-9a21-6b9fa70855da
Feb 27 17:13:17 compute-0 systemd-machined[156136]: New machine qemu-2-instance-00000002.
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.073 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0594912a-e464-464f-9d40-51ea956519cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.075 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57b86049-a1 in ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.077 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57b86049-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.077 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb9896b-b048-4052-8a91-aa87d04ea806]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.078 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c7878ff8-98a0-45cd-af9a-de5d3a93fb51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Feb 27 17:13:17 compute-0 systemd-udevd[216004]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.102 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[78986963-6843-47a6-8c58-0a17f4c420a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.1191] device (tap5ea75f63-b4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.1201] device (tap5ea75f63-b4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.127 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[76777b40-ab86-40e8-bd1d-424f203214a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.148 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[7a09c765-7ef1-44a7-8249-97c7961b54cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.152 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[db46fd23-1926-45ff-b124-74cfce87761f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.1535] manager: (tap57b86049-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.179 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[942cc030-0917-40fa-a767-160ee578de63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.181 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc5458d-ce86-4ebd-8750-11908f695873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.2014] device (tap57b86049-a0): carrier: link connected
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.205 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[346c7b66-dd6e-4dd7-9eaf-57a2600772da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.219 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b9c8ffc3-e123-4761-b37e-316b14f936ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57b86049-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:2d:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327269, 'reachable_time': 43153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216035, 'error': None, 'target': 'ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.229 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[056117d1-5d80-42b3-bf17-7822f2ef0f12]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:2d56'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 327269, 'tstamp': 327269}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216036, 'error': None, 'target': 'ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.244 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[11ce2ec8-1ea8-4e5a-ac11-a6cafe3fe8d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57b86049-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:2d:56'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327269, 'reachable_time': 43153, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216037, 'error': None, 'target': 'ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.266 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a52867a0-104f-40da-9253-92fae244f625]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.302 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[204b270c-b4f5-490a-b36a-487d0758310d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.304 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57b86049-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.304 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.305 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57b86049-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:17 compute-0 kernel: tap57b86049-a0: entered promiscuous mode
Feb 27 17:13:17 compute-0 NetworkManager[56537]: <info>  [1772212397.3097] manager: (tap57b86049-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.310 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.314 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57b86049-a0, col_values=(('external_ids', {'iface-id': '594740b9-a67c-44ad-832c-350a7ac64ea9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.315 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 ovn_controller[96756]: 2026-02-27T17:13:17Z|00038|binding|INFO|Releasing lport 594740b9-a67c-44ad-832c-350a7ac64ea9 from this chassis (sb_readonly=0)
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.320 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.319 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57b86049-a87b-4ca2-9a21-6b9fa70855da.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57b86049-a87b-4ca2-9a21-6b9fa70855da.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.321 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[879acf5a-6117-42ac-89d7-6a69d758cbc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.323 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-57b86049-a87b-4ca2-9a21-6b9fa70855da
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/57b86049-a87b-4ca2-9a21-6b9fa70855da.pid.haproxy
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 57b86049-a87b-4ca2-9a21-6b9fa70855da
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:13:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:17.324 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'env', 'PROCESS_TAG=haproxy-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57b86049-a87b-4ca2-9a21-6b9fa70855da.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.447 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212397.446494, 7907c492-fa28-419d-94ba-339df849c7be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.447 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] VM Started (Lifecycle Event)
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.480 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.485 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212397.4477906, 7907c492-fa28-419d-94ba-339df849c7be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.485 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] VM Paused (Lifecycle Event)
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.509 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.513 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.547 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.572 186844 DEBUG nova.compute.manager [req-d5adb682-3530-43f1-a79b-02647c69609d req-756cb38c-7090-42f7-90e4-ca09c56a885c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.572 186844 DEBUG oslo_concurrency.lockutils [req-d5adb682-3530-43f1-a79b-02647c69609d req-756cb38c-7090-42f7-90e4-ca09c56a885c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.573 186844 DEBUG oslo_concurrency.lockutils [req-d5adb682-3530-43f1-a79b-02647c69609d req-756cb38c-7090-42f7-90e4-ca09c56a885c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.573 186844 DEBUG oslo_concurrency.lockutils [req-d5adb682-3530-43f1-a79b-02647c69609d req-756cb38c-7090-42f7-90e4-ca09c56a885c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.574 186844 DEBUG nova.compute.manager [req-d5adb682-3530-43f1-a79b-02647c69609d req-756cb38c-7090-42f7-90e4-ca09c56a885c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Processing event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.575 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.578 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212397.5784318, 7907c492-fa28-419d-94ba-339df849c7be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.579 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] VM Resumed (Lifecycle Event)
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.583 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.586 186844 INFO nova.virt.libvirt.driver [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Instance spawned successfully.
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.587 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:13:17 compute-0 podman[216076]: 2026-02-27 17:13:17.679390407 +0000 UTC m=+0.070797630 container create e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223, io.buildah.version=1.43.0)
Feb 27 17:13:17 compute-0 systemd[1]: Started libpod-conmon-e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae.scope.
Feb 27 17:13:17 compute-0 podman[216076]: 2026-02-27 17:13:17.640536259 +0000 UTC m=+0.031943572 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:13:17 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:13:17 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64472d587e32ce6015e448865f0c20bebc4b7ceb9a5c1cab57e6c48f5db45483/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:13:17 compute-0 podman[216076]: 2026-02-27 17:13:17.765811394 +0000 UTC m=+0.157218667 container init e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0)
Feb 27 17:13:17 compute-0 podman[216076]: 2026-02-27 17:13:17.776108882 +0000 UTC m=+0.167516105 container start e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:13:17 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [NOTICE]   (216095) : New worker (216097) forked
Feb 27 17:13:17 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [NOTICE]   (216095) : Loading success.
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.827 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.832 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.864 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.874 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.875 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.876 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.876 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.877 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.878 186844 DEBUG nova.virt.libvirt.driver [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.933 186844 INFO nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Took 10.60 seconds to spawn the instance on the hypervisor.
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.933 186844 DEBUG nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:17 compute-0 nova_compute[186840]: 2026-02-27 17:13:17.994 186844 INFO nova.compute.manager [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Took 11.47 seconds to build instance.
Feb 27 17:13:18 compute-0 nova_compute[186840]: 2026-02-27 17:13:18.016 186844 DEBUG oslo_concurrency.lockutils [None req-c47a05c3-b894-4629-803f-c92d453290b5 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:18 compute-0 nova_compute[186840]: 2026-02-27 17:13:18.097 186844 DEBUG nova.network.neutron [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Updated VIF entry in instance network info cache for port 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:13:18 compute-0 nova_compute[186840]: 2026-02-27 17:13:18.098 186844 DEBUG nova.network.neutron [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Updating instance_info_cache with network_info: [{"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:18 compute-0 nova_compute[186840]: 2026-02-27 17:13:18.119 186844 DEBUG oslo_concurrency.lockutils [req-825161f3-84f0-4926-aef5-cf312013d8a9 req-75565d84-23cb-4048-af18-cd82c69d733c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-7907c492-fa28-419d-94ba-339df849c7be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.660 186844 DEBUG nova.compute.manager [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.661 186844 DEBUG oslo_concurrency.lockutils [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.662 186844 DEBUG oslo_concurrency.lockutils [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.662 186844 DEBUG oslo_concurrency.lockutils [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.663 186844 DEBUG nova.compute.manager [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] No waiting events found dispatching network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:13:19 compute-0 nova_compute[186840]: 2026-02-27 17:13:19.663 186844 WARNING nova.compute.manager [req-ed36af6a-7714-4821-b295-2b1022e4c8d4 req-3a4669f9-0a47-4cd9-bd39-c6a105d4ae5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received unexpected event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 for instance with vm_state active and task_state None.
Feb 27 17:13:20 compute-0 nova_compute[186840]: 2026-02-27 17:13:20.162 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:21 compute-0 nova_compute[186840]: 2026-02-27 17:13:21.398 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:25 compute-0 nova_compute[186840]: 2026-02-27 17:13:25.164 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:26 compute-0 nova_compute[186840]: 2026-02-27 17:13:26.400 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:28 compute-0 ovn_controller[96756]: 2026-02-27T17:13:28Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:d5:aa 10.100.0.25
Feb 27 17:13:28 compute-0 ovn_controller[96756]: 2026-02-27T17:13:28Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:d5:aa 10.100.0.25
Feb 27 17:13:28 compute-0 podman[216126]: 2026-02-27 17:13:28.639816411 +0000 UTC m=+0.049648370 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:13:30 compute-0 nova_compute[186840]: 2026-02-27 17:13:30.166 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:31 compute-0 nova_compute[186840]: 2026-02-27 17:13:31.402 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:31 compute-0 podman[216151]: 2026-02-27 17:13:31.655093349 +0000 UTC m=+0.061977818 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:13:34 compute-0 podman[216168]: 2026-02-27 17:13:34.673953092 +0000 UTC m=+0.084295076 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.167 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.760 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.761 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.761 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.762 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.762 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.764 186844 INFO nova.compute.manager [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Terminating instance
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.765 186844 DEBUG nova.compute.manager [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:13:35 compute-0 kernel: tap5ea75f63-b4 (unregistering): left promiscuous mode
Feb 27 17:13:35 compute-0 NetworkManager[56537]: <info>  [1772212415.7874] device (tap5ea75f63-b4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:13:35 compute-0 ovn_controller[96756]: 2026-02-27T17:13:35Z|00039|binding|INFO|Releasing lport 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 from this chassis (sb_readonly=0)
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.789 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:35 compute-0 ovn_controller[96756]: 2026-02-27T17:13:35Z|00040|binding|INFO|Setting lport 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 down in Southbound
Feb 27 17:13:35 compute-0 ovn_controller[96756]: 2026-02-27T17:13:35Z|00041|binding|INFO|Removing iface tap5ea75f63-b4 ovn-installed in OVS
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.792 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:35.800 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:d5:aa 10.100.0.25'], port_security=['fa:16:3e:65:d5:aa 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '7907c492-fa28-419d-94ba-339df849c7be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '57746e65-06e9-4e6e-834d-171c9ed58897', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9854a9e-6d8f-43f4-b0b6-716f9aa4a647, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:13:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:35.802 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 in datapath 57b86049-a87b-4ca2-9a21-6b9fa70855da unbound from our chassis
Feb 27 17:13:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:35.804 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57b86049-a87b-4ca2-9a21-6b9fa70855da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:13:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:35.805 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0a8f9217-b71a-4344-9060-a2cd8275e04f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:35 compute-0 nova_compute[186840]: 2026-02-27 17:13:35.806 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:35.806 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da namespace which is not needed anymore
Feb 27 17:13:35 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 27 17:13:35 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 11.515s CPU time.
Feb 27 17:13:35 compute-0 systemd-machined[156136]: Machine qemu-2-instance-00000002 terminated.
Feb 27 17:13:35 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [NOTICE]   (216095) : haproxy version is 2.8.14-c23fe91
Feb 27 17:13:35 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [NOTICE]   (216095) : path to executable is /usr/sbin/haproxy
Feb 27 17:13:35 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [WARNING]  (216095) : Exiting Master process...
Feb 27 17:13:35 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [ALERT]    (216095) : Current worker (216097) exited with code 143 (Terminated)
Feb 27 17:13:35 compute-0 neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da[216091]: [WARNING]  (216095) : All workers exited. Exiting... (0)
Feb 27 17:13:35 compute-0 systemd[1]: libpod-e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae.scope: Deactivated successfully.
Feb 27 17:13:35 compute-0 podman[216221]: 2026-02-27 17:13:35.935811108 +0000 UTC m=+0.048465752 container died e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:13:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae-userdata-shm.mount: Deactivated successfully.
Feb 27 17:13:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-64472d587e32ce6015e448865f0c20bebc4b7ceb9a5c1cab57e6c48f5db45483-merged.mount: Deactivated successfully.
Feb 27 17:13:35 compute-0 podman[216221]: 2026-02-27 17:13:35.963186318 +0000 UTC m=+0.075840962 container cleanup e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 27 17:13:35 compute-0 systemd[1]: libpod-conmon-e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae.scope: Deactivated successfully.
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.008 186844 INFO nova.virt.libvirt.driver [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Instance destroyed successfully.
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.008 186844 DEBUG nova.objects.instance [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 7907c492-fa28-419d-94ba-339df849c7be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:13:36 compute-0 podman[216252]: 2026-02-27 17:13:36.017757556 +0000 UTC m=+0.036425540 container remove e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.020 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac6b5f8-67e6-4549-876f-b7d9eb0dd41d]: (4, ('Fri Feb 27 05:13:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da (e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae)\ne18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae\nFri Feb 27 05:13:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da (e18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae)\ne18b37ae92dc70f5a2cbafdc4cfd8c976e8f845ebffdef3fb2bf40da8e4841ae\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.022 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[6acc90b0-1df0-478c-b4f7-28f01959bf24]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.023 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57b86049-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:36 compute-0 kernel: tap57b86049-a0: left promiscuous mode
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.025 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.031 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.033 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[68aa226e-9971-491e-b1cd-e537d4d093c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.037 186844 DEBUG nova.virt.libvirt.vif [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:13:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1662499746',display_name='tempest-TestNetworkBasicOps-server-1662499746',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1662499746',id=2,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPJpvJXjYyaP9weXRX/frAT3/yEysE0lVxAsjmpNGAzBT9Oj2vbtxwuS1na7jO3chcTPpLrlUFwb1XAnLTLReglOwXXu8OR5Kmokm6p8kdkdUiOWT63Ya5K18FBRbp4C4A==',key_name='tempest-TestNetworkBasicOps-1344904446',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:13:17Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-7jo6re60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:13:17Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=7907c492-fa28-419d-94ba-339df849c7be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.037 186844 DEBUG nova.network.os_vif_util [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "address": "fa:16:3e:65:d5:aa", "network": {"id": "57b86049-a87b-4ca2-9a21-6b9fa70855da", "bridge": "br-int", "label": "tempest-network-smoke--278307282", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5ea75f63-b4", "ovs_interfaceid": "5ea75f63-b49d-4e41-8e27-d6f2a79d89b0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.038 186844 DEBUG nova.network.os_vif_util [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.038 186844 DEBUG os_vif [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.040 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.040 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5ea75f63-b4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.041 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.042 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.044 186844 INFO os_vif [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:d5:aa,bridge_name='br-int',has_traffic_filtering=True,id=5ea75f63-b49d-4e41-8e27-d6f2a79d89b0,network=Network(57b86049-a87b-4ca2-9a21-6b9fa70855da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5ea75f63-b4')
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.045 186844 INFO nova.virt.libvirt.driver [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Deleting instance files /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be_del
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.045 186844 INFO nova.virt.libvirt.driver [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Deletion of /var/lib/nova/instances/7907c492-fa28-419d-94ba-339df849c7be_del complete
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.055 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[eca70429-39ab-4a76-bdd9-7d441926023c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.056 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a6a0c443-529b-444b-ac98-940ee56a53e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.069 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[72ffd68c-8505-4c49-8d2a-0b208bb4ba27]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 327263, 'reachable_time': 24016, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216288, 'error': None, 'target': 'ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.072 186844 DEBUG nova.compute.manager [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-unplugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.072 186844 DEBUG oslo_concurrency.lockutils [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.072 186844 DEBUG oslo_concurrency.lockutils [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.073 186844 DEBUG oslo_concurrency.lockutils [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.073 186844 DEBUG nova.compute.manager [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] No waiting events found dispatching network-vif-unplugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.073 186844 DEBUG nova.compute.manager [req-73b1b2fb-4ebe-45c8-ae0d-c5a07c0a8c4c req-b66385e2-8422-47b1-971a-fd88becd7a04 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-unplugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:13:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d57b86049\x2da87b\x2d4ca2\x2d9a21\x2d6b9fa70855da.mount: Deactivated successfully.
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.077 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57b86049-a87b-4ca2-9a21-6b9fa70855da deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:13:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:36.078 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[f8593138-763b-44b3-b996-f599573c98ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.144 186844 DEBUG nova.virt.libvirt.host [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.144 186844 INFO nova.virt.libvirt.host [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] UEFI support detected
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.146 186844 INFO nova.compute.manager [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Took 0.38 seconds to destroy the instance on the hypervisor.
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.146 186844 DEBUG oslo.service.loopingcall [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.147 186844 DEBUG nova.compute.manager [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:13:36 compute-0 nova_compute[186840]: 2026-02-27 17:13:36.147 186844 DEBUG nova.network.neutron [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:13:37 compute-0 podman[216290]: 2026-02-27 17:13:37.650225868 +0000 UTC m=+0.060679066 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.178 186844 DEBUG nova.compute.manager [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.179 186844 DEBUG oslo_concurrency.lockutils [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "7907c492-fa28-419d-94ba-339df849c7be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.179 186844 DEBUG oslo_concurrency.lockutils [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.179 186844 DEBUG oslo_concurrency.lockutils [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.179 186844 DEBUG nova.compute.manager [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] No waiting events found dispatching network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.180 186844 WARNING nova.compute.manager [req-06ae1ba9-13ec-48ab-9d60-ebfcd87f196d req-6acf25ba-a662-4d03-8642-df2d484b0ac1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received unexpected event network-vif-plugged-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 for instance with vm_state active and task_state deleting.
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.643 186844 DEBUG nova.network.neutron [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.661 186844 INFO nova.compute.manager [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Took 2.51 seconds to deallocate network for instance.
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.727 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.728 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.841 186844 DEBUG nova.compute.provider_tree [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.903 186844 DEBUG nova.scheduler.client.report [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.922 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:38 compute-0 nova_compute[186840]: 2026-02-27 17:13:38.949 186844 INFO nova.scheduler.client.report [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 7907c492-fa28-419d-94ba-339df849c7be
Feb 27 17:13:39 compute-0 nova_compute[186840]: 2026-02-27 17:13:39.053 186844 DEBUG oslo_concurrency.lockutils [None req-759ecd2a-5f9f-459c-8462-78bc7c1b4c64 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "7907c492-fa28-419d-94ba-339df849c7be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:40 compute-0 nova_compute[186840]: 2026-02-27 17:13:40.169 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:40 compute-0 nova_compute[186840]: 2026-02-27 17:13:40.345 186844 DEBUG nova.compute.manager [req-e63bf715-c87d-47f0-b490-841f4a64dcb4 req-a03695df-64c3-4fe8-9e2a-7c7388a0a1c4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Received event network-vif-deleted-5ea75f63-b49d-4e41-8e27-d6f2a79d89b0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:41 compute-0 nova_compute[186840]: 2026-02-27 17:13:41.042 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:41 compute-0 ovn_controller[96756]: 2026-02-27T17:13:41Z|00042|binding|INFO|Releasing lport f65af9bf-2e56-4266-8f83-fdf23f039108 from this chassis (sb_readonly=0)
Feb 27 17:13:41 compute-0 nova_compute[186840]: 2026-02-27 17:13:41.667 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.630 186844 DEBUG nova.compute.manager [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.631 186844 DEBUG nova.compute.manager [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing instance network info cache due to event network-changed-230aa4f7-60f7-415f-98c3-f586b1551f43. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.631 186844 DEBUG oslo_concurrency.lockutils [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.632 186844 DEBUG oslo_concurrency.lockutils [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.632 186844 DEBUG nova.network.neutron [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Refreshing network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:13:42 compute-0 podman[216312]: 2026-02-27 17:13:42.671449661 +0000 UTC m=+0.077606997 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.728 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.729 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.729 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.729 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.729 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.730 186844 INFO nova.compute.manager [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Terminating instance
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.731 186844 DEBUG nova.compute.manager [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:13:42 compute-0 kernel: tap230aa4f7-60 (unregistering): left promiscuous mode
Feb 27 17:13:42 compute-0 NetworkManager[56537]: <info>  [1772212422.7615] device (tap230aa4f7-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.761 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 ovn_controller[96756]: 2026-02-27T17:13:42Z|00043|binding|INFO|Releasing lport 230aa4f7-60f7-415f-98c3-f586b1551f43 from this chassis (sb_readonly=0)
Feb 27 17:13:42 compute-0 ovn_controller[96756]: 2026-02-27T17:13:42Z|00044|binding|INFO|Setting lport 230aa4f7-60f7-415f-98c3-f586b1551f43 down in Southbound
Feb 27 17:13:42 compute-0 ovn_controller[96756]: 2026-02-27T17:13:42Z|00045|binding|INFO|Removing iface tap230aa4f7-60 ovn-installed in OVS
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.763 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:42.772 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:2f:65 10.100.0.6'], port_security=['fa:16:3e:53:2f:65 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '826adfc9-edc2-47cf-82ae-f8b79aebaa68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1e09f223-d0ad-4562-8f86-7f3bcd96c7c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bc7bb2cb-ff0b-4036-98ea-f69638f43f43, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=230aa4f7-60f7-415f-98c3-f586b1551f43) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:13:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:42.775 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 230aa4f7-60f7-415f-98c3-f586b1551f43 in datapath 8cc3eca5-483a-473a-bac8-5f86e54d4447 unbound from our chassis
Feb 27 17:13:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:42.777 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8cc3eca5-483a-473a-bac8-5f86e54d4447, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.777 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:42.778 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[08b8559b-8847-4583-baf1-fefaa28a0b45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:42.780 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447 namespace which is not needed anymore
Feb 27 17:13:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Feb 27 17:13:42 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 13.577s CPU time.
Feb 27 17:13:42 compute-0 systemd-machined[156136]: Machine qemu-1-instance-00000001 terminated.
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [NOTICE]   (215752) : haproxy version is 2.8.14-c23fe91
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [NOTICE]   (215752) : path to executable is /usr/sbin/haproxy
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [WARNING]  (215752) : Exiting Master process...
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [WARNING]  (215752) : Exiting Master process...
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [ALERT]    (215752) : Current worker (215754) exited with code 143 (Terminated)
Feb 27 17:13:42 compute-0 neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447[215748]: [WARNING]  (215752) : All workers exited. Exiting... (0)
Feb 27 17:13:42 compute-0 systemd[1]: libpod-10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c.scope: Deactivated successfully.
Feb 27 17:13:42 compute-0 podman[216358]: 2026-02-27 17:13:42.942547954 +0000 UTC m=+0.055521540 container died 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223)
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.948 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.955 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.979 186844 INFO nova.virt.libvirt.driver [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Instance destroyed successfully.
Feb 27 17:13:42 compute-0 nova_compute[186840]: 2026-02-27 17:13:42.980 186844 DEBUG nova.objects.instance [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 826adfc9-edc2-47cf-82ae-f8b79aebaa68 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:13:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c-userdata-shm.mount: Deactivated successfully.
Feb 27 17:13:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-eee797cadf0e20cde99125dcf10a7401e7b5a4c1b5464b43abbbd8897e6d162f-merged.mount: Deactivated successfully.
Feb 27 17:13:43 compute-0 podman[216358]: 2026-02-27 17:13:43.000047269 +0000 UTC m=+0.113020885 container cleanup 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.005 186844 DEBUG nova.virt.libvirt.vif [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:12:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1769989291',display_name='tempest-TestNetworkBasicOps-server-1769989291',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1769989291',id=1,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChDldA9eHt69UmmFDwvFUnEpXA7RKYuw4U1QoiTTfttm2GMj/uAfp8mL+79aV7KIdshvzwUkOP1mGNagvdQMwWrNtdrRQcXKglkmafYrQN13J3tRiiJ795KKNmbNnHXUQ==',key_name='tempest-TestNetworkBasicOps-1253152918',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:12:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-dty06rmr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:12:33Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=826adfc9-edc2-47cf-82ae-f8b79aebaa68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:13:43 compute-0 systemd[1]: libpod-conmon-10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c.scope: Deactivated successfully.
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.006 186844 DEBUG nova.network.os_vif_util [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.006 186844 DEBUG nova.network.os_vif_util [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.007 186844 DEBUG os_vif [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.008 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.008 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap230aa4f7-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.009 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.011 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.013 186844 INFO os_vif [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:2f:65,bridge_name='br-int',has_traffic_filtering=True,id=230aa4f7-60f7-415f-98c3-f586b1551f43,network=Network(8cc3eca5-483a-473a-bac8-5f86e54d4447),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap230aa4f7-60')
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.013 186844 INFO nova.virt.libvirt.driver [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Deleting instance files /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68_del
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.013 186844 INFO nova.virt.libvirt.driver [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Deletion of /var/lib/nova/instances/826adfc9-edc2-47cf-82ae-f8b79aebaa68_del complete
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.062 186844 DEBUG nova.compute.manager [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-unplugged-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.062 186844 DEBUG oslo_concurrency.lockutils [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.062 186844 DEBUG oslo_concurrency.lockutils [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.062 186844 DEBUG oslo_concurrency.lockutils [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.063 186844 DEBUG nova.compute.manager [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] No waiting events found dispatching network-vif-unplugged-230aa4f7-60f7-415f-98c3-f586b1551f43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.063 186844 DEBUG nova.compute.manager [req-11db8f31-9a44-48fa-a4f4-b310fbba0dfe req-6da91e99-556f-4e0c-a829-27d6ec686cbf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-unplugged-230aa4f7-60f7-415f-98c3-f586b1551f43 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:13:43 compute-0 podman[216404]: 2026-02-27 17:13:43.077406129 +0000 UTC m=+0.051012474 container remove 10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.vendor=CentOS)
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.082 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c557d890-3c1e-4586-8cf1-3f9ec95e1538]: (4, ('Fri Feb 27 05:13:42 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447 (10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c)\n10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c\nFri Feb 27 05:13:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447 (10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c)\n10fc6517b8aa65df5e96386d41658ae641a0125c219d29c4be59ed563d8b780c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.084 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[df103a8f-b1f8-4981-8396-0cd76d2c6efa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.085 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8cc3eca5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.087 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 kernel: tap8cc3eca5-40: left promiscuous mode
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.089 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.091 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7f000c83-0a3b-48ad-963a-668fa0ea4932]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.094 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.109 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[64334809-ad30-4ba6-9d13-b7b235a24333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.111 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[cd199c4b-8445-45e6-9cd7-1ebb390d0f2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.125 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8fc9b0-1a1a-415a-9e57-a2acc408c0cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 323165, 'reachable_time': 44797, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216419, 'error': None, 'target': 'ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.128 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8cc3eca5-483a-473a-bac8-5f86e54d4447 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:13:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:43.128 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[d81f7df5-3525-4811-b40b-2e4f335d58e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:13:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d8cc3eca5\x2d483a\x2d473a\x2dbac8\x2d5f86e54d4447.mount: Deactivated successfully.
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.262 186844 INFO nova.compute.manager [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Took 0.53 seconds to destroy the instance on the hypervisor.
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.263 186844 DEBUG oslo.service.loopingcall [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.264 186844 DEBUG nova.compute.manager [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:13:43 compute-0 nova_compute[186840]: 2026-02-27 17:13:43.264 186844 DEBUG nova.network.neutron [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.130 186844 DEBUG nova.network.neutron [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updated VIF entry in instance network info cache for port 230aa4f7-60f7-415f-98c3-f586b1551f43. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.131 186844 DEBUG nova.network.neutron [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [{"id": "230aa4f7-60f7-415f-98c3-f586b1551f43", "address": "fa:16:3e:53:2f:65", "network": {"id": "8cc3eca5-483a-473a-bac8-5f86e54d4447", "bridge": "br-int", "label": "tempest-network-smoke--1938563289", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap230aa4f7-60", "ovs_interfaceid": "230aa4f7-60f7-415f-98c3-f586b1551f43", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.171 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.175 186844 DEBUG oslo_concurrency.lockutils [req-0987af76-0a32-4f20-897f-8462d358580c req-031aaf3e-5440-4766-ab4f-ed41e005c651 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-826adfc9-edc2-47cf-82ae-f8b79aebaa68" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.176 186844 DEBUG nova.network.neutron [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.198 186844 INFO nova.compute.manager [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Took 1.93 seconds to deallocate network for instance.
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.247 186844 DEBUG nova.compute.manager [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.248 186844 DEBUG oslo_concurrency.lockutils [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.248 186844 DEBUG oslo_concurrency.lockutils [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.248 186844 DEBUG oslo_concurrency.lockutils [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.249 186844 DEBUG nova.compute.manager [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] No waiting events found dispatching network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.249 186844 WARNING nova.compute.manager [req-dc665acc-146b-42b8-af9d-126893b8cb3f req-9ac7476f-6c70-4993-9681-39e994239785 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received unexpected event network-vif-plugged-230aa4f7-60f7-415f-98c3-f586b1551f43 for instance with vm_state active and task_state deleting.
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.257 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.258 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.269 186844 DEBUG nova.compute.manager [req-ac5b2a55-1d84-4f8a-9d41-6891df61d68e req-ab600584-0f6c-4495-8902-4055e76c49d1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Received event network-vif-deleted-230aa4f7-60f7-415f-98c3-f586b1551f43 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.348 186844 DEBUG nova.compute.provider_tree [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.370 186844 DEBUG nova.scheduler.client.report [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.399 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.441 186844 INFO nova.scheduler.client.report [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 826adfc9-edc2-47cf-82ae-f8b79aebaa68
Feb 27 17:13:45 compute-0 nova_compute[186840]: 2026-02-27 17:13:45.519 186844 DEBUG oslo_concurrency.lockutils [None req-d81b020f-c94a-44f6-906e-c14b7e0b97cb 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "826adfc9-edc2-47cf-82ae-f8b79aebaa68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:47.088 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:13:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:47.089 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:13:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:13:47.089 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:13:47 compute-0 podman[216420]: 2026-02-27 17:13:47.662750874 +0000 UTC m=+0.063363224 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:13:48 compute-0 nova_compute[186840]: 2026-02-27 17:13:48.010 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:50 compute-0 nova_compute[186840]: 2026-02-27 17:13:50.198 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:50 compute-0 nova_compute[186840]: 2026-02-27 17:13:50.595 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:50 compute-0 nova_compute[186840]: 2026-02-27 17:13:50.639 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:51 compute-0 nova_compute[186840]: 2026-02-27 17:13:51.007 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212416.0058196, 7907c492-fa28-419d-94ba-339df849c7be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:13:51 compute-0 nova_compute[186840]: 2026-02-27 17:13:51.008 186844 INFO nova.compute.manager [-] [instance: 7907c492-fa28-419d-94ba-339df849c7be] VM Stopped (Lifecycle Event)
Feb 27 17:13:51 compute-0 nova_compute[186840]: 2026-02-27 17:13:51.032 186844 DEBUG nova.compute.manager [None req-ac9094d5-60a3-44e7-93bd-045f839b066b - - - - - -] [instance: 7907c492-fa28-419d-94ba-339df849c7be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:53 compute-0 nova_compute[186840]: 2026-02-27 17:13:53.012 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:55 compute-0 nova_compute[186840]: 2026-02-27 17:13:55.199 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:57 compute-0 nova_compute[186840]: 2026-02-27 17:13:57.979 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212422.9772139, 826adfc9-edc2-47cf-82ae-f8b79aebaa68 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:13:57 compute-0 nova_compute[186840]: 2026-02-27 17:13:57.980 186844 INFO nova.compute.manager [-] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] VM Stopped (Lifecycle Event)
Feb 27 17:13:58 compute-0 nova_compute[186840]: 2026-02-27 17:13:58.016 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:13:58 compute-0 nova_compute[186840]: 2026-02-27 17:13:58.045 186844 DEBUG nova.compute.manager [None req-51b88640-28a4-4041-adbe-d1ce8c617bea - - - - - -] [instance: 826adfc9-edc2-47cf-82ae-f8b79aebaa68] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:13:59 compute-0 podman[216446]: 2026-02-27 17:13:59.682201514 +0000 UTC m=+0.086587507 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:14:00 compute-0 nova_compute[186840]: 2026-02-27 17:14:00.294 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:02 compute-0 nova_compute[186840]: 2026-02-27 17:14:02.347 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:02 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:02.347 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:14:02 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:02.349 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:14:02 compute-0 podman[216470]: 2026-02-27 17:14:02.632142634 +0000 UTC m=+0.041134464 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:14:03 compute-0 nova_compute[186840]: 2026-02-27 17:14:03.029 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:05 compute-0 nova_compute[186840]: 2026-02-27 17:14:05.217 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.267 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:14:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:14:05 compute-0 podman[216489]: 2026-02-27 17:14:05.66609566 +0000 UTC m=+0.073650944 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 27 17:14:07 compute-0 nova_compute[186840]: 2026-02-27 17:14:07.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.031 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:08 compute-0 podman[216516]: 2026-02-27 17:14:08.658465633 +0000 UTC m=+0.060575909 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=)
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.697 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.697 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.758 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.759 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.759 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.759 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.908 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.911 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5795MB free_disk=73.1952133178711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.911 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.911 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.968 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.969 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:14:08 compute-0 nova_compute[186840]: 2026-02-27 17:14:08.991 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:14:09 compute-0 nova_compute[186840]: 2026-02-27 17:14:09.013 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:14:09 compute-0 nova_compute[186840]: 2026-02-27 17:14:09.040 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:14:09 compute-0 nova_compute[186840]: 2026-02-27 17:14:09.041 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:09.352 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:10 compute-0 nova_compute[186840]: 2026-02-27 17:14:10.045 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:10 compute-0 nova_compute[186840]: 2026-02-27 17:14:10.225 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:10 compute-0 nova_compute[186840]: 2026-02-27 17:14:10.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:10 compute-0 nova_compute[186840]: 2026-02-27 17:14:10.701 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:10 compute-0 nova_compute[186840]: 2026-02-27 17:14:10.701 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.338 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.339 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.364 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.451 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.452 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.460 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.460 186844 INFO nova.compute.claims [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.718 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.718 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.719 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.762 186844 DEBUG nova.compute.provider_tree [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.779 186844 DEBUG nova.scheduler.client.report [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.803 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.351s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.804 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.861 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.861 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.891 186844 INFO nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:14:12 compute-0 nova_compute[186840]: 2026-02-27 17:14:12.917 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.032 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.036 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.037 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.038 186844 INFO nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Creating image(s)
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.038 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.039 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.039 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.054 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.113 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.115 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.115 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.126 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.173 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.175 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.203 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.204 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.089s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.204 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.257 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.258 186844 DEBUG nova.virt.disk.api [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.258 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.307 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.308 186844 DEBUG nova.virt.disk.api [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.309 186844 DEBUG nova.objects.instance [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.331 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.331 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Ensure instance console log exists: /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.332 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.332 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.332 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:13 compute-0 nova_compute[186840]: 2026-02-27 17:14:13.345 186844 DEBUG nova.policy [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:14:13 compute-0 podman[216554]: 2026-02-27 17:14:13.650680805 +0000 UTC m=+0.060291471 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223)
Feb 27 17:14:14 compute-0 nova_compute[186840]: 2026-02-27 17:14:14.361 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Successfully created port: d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.230 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.507 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Successfully updated port: d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.528 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.528 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.529 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.616 186844 DEBUG nova.compute.manager [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.617 186844 DEBUG nova.compute.manager [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing instance network info cache due to event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.617 186844 DEBUG oslo_concurrency.lockutils [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:15 compute-0 nova_compute[186840]: 2026-02-27 17:14:15.698 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.780 186844 DEBUG nova.network.neutron [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.813 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.814 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance network_info: |[{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.814 186844 DEBUG oslo_concurrency.lockutils [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.814 186844 DEBUG nova.network.neutron [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing network info cache for port d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.818 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Start _get_guest_xml network_info=[{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.822 186844 WARNING nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.828 186844 DEBUG nova.virt.libvirt.host [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.829 186844 DEBUG nova.virt.libvirt.host [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.833 186844 DEBUG nova.virt.libvirt.host [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.834 186844 DEBUG nova.virt.libvirt.host [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.834 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.835 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.836 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.836 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.837 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.837 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.837 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.838 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.838 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.839 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.839 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.840 186844 DEBUG nova.virt.hardware [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.845 186844 DEBUG nova.virt.libvirt.vif [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:14:12Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.846 186844 DEBUG nova.network.os_vif_util [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.847 186844 DEBUG nova.network.os_vif_util [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.849 186844 DEBUG nova.objects.instance [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.868 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <uuid>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</uuid>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <name>instance-00000003</name>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:14:16</nova:creationTime>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:16 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <system>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="serial">32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="uuid">32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </system>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <os>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </os>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <features>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </features>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:ff:25:bc"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <target dev="tapd12ccaa4-20"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log" append="off"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <video>
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </video>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:14:16 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:14:16 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:14:16 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:14:16 compute-0 nova_compute[186840]: </domain>
Feb 27 17:14:16 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.869 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Preparing to wait for external event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.870 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.870 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.871 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.872 186844 DEBUG nova.virt.libvirt.vif [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:14:12Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.872 186844 DEBUG nova.network.os_vif_util [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.873 186844 DEBUG nova.network.os_vif_util [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.874 186844 DEBUG os_vif [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.875 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.875 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.876 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.881 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.881 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd12ccaa4-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.882 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd12ccaa4-20, col_values=(('external_ids', {'iface-id': 'd12ccaa4-2084-47aa-9d33-a6fe5a03b378', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:25:bc', 'vm-uuid': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.884 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:16 compute-0 NetworkManager[56537]: <info>  [1772212456.8856] manager: (tapd12ccaa4-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.887 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.889 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.890 186844 INFO os_vif [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20')
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.943 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.944 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.945 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:ff:25:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:14:16 compute-0 nova_compute[186840]: 2026-02-27 17:14:16.945 186844 INFO nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Using config drive
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.363 186844 INFO nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Creating config drive at /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.370 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp3sqlcxca execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.491 186844 DEBUG oslo_concurrency.processutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp3sqlcxca" returned: 0 in 0.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:14:18 compute-0 kernel: tapd12ccaa4-20: entered promiscuous mode
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.5329] manager: (tapd12ccaa4-20): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Feb 27 17:14:18 compute-0 ovn_controller[96756]: 2026-02-27T17:14:18Z|00046|binding|INFO|Claiming lport d12ccaa4-2084-47aa-9d33-a6fe5a03b378 for this chassis.
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.533 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_controller[96756]: 2026-02-27T17:14:18Z|00047|binding|INFO|d12ccaa4-2084-47aa-9d33-a6fe5a03b378: Claiming fa:16:3e:ff:25:bc 10.100.0.12
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.536 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.538 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.549 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:25:bc 10.100.0.12'], port_security=['fa:16:3e:ff:25:bc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-855ecfce-44da-448e-8c49-042beea13b27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '06ab3f43-d57d-4eb1-a403-6b03d8fb5d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=554d2234-f016-4ddb-80d6-cd7df22dc480, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=d12ccaa4-2084-47aa-9d33-a6fe5a03b378) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.550 106085 INFO neutron.agent.ovn.metadata.agent [-] Port d12ccaa4-2084-47aa-9d33-a6fe5a03b378 in datapath 855ecfce-44da-448e-8c49-042beea13b27 bound to our chassis
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.551 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 855ecfce-44da-448e-8c49-042beea13b27
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.553 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_controller[96756]: 2026-02-27T17:14:18Z|00048|binding|INFO|Setting lport d12ccaa4-2084-47aa-9d33-a6fe5a03b378 ovn-installed in OVS
Feb 27 17:14:18 compute-0 ovn_controller[96756]: 2026-02-27T17:14:18Z|00049|binding|INFO|Setting lport d12ccaa4-2084-47aa-9d33-a6fe5a03b378 up in Southbound
Feb 27 17:14:18 compute-0 systemd-machined[156136]: New machine qemu-3-instance-00000003.
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.559 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[14a06e65-900c-4959-8ead-9c3121901861]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.560 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap855ecfce-41 in ovnmeta-855ecfce-44da-448e-8c49-042beea13b27 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.561 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap855ecfce-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.562 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e3f04efa-a73f-4b57-ab77-acf0afbc7f02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.558 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.562 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e8806c79-3af6-4b1b-b652-4f6eb0f93cae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.571 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[2dff7f93-960f-4383-8865-d3f18b56c222]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.581 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7bd2be90-1fa9-45ec-86aa-120f02e7e859]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 systemd-udevd[216619]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:14:18 compute-0 podman[216580]: 2026-02-27 17:14:18.589869629 +0000 UTC m=+0.061733986 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.5943] device (tapd12ccaa4-20): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.5953] device (tapd12ccaa4-20): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.608 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb42e20-a623-4309-a90e-723bde534405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.613 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a7b66c-71ae-4d90-98a5-09c40cde0d78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 systemd-udevd[216623]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.6140] manager: (tap855ecfce-40): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.638 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[069aa26c-ad34-4758-8c12-fbc9f2cec990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.644 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[cb057918-1dde-4c2b-891c-051c1fbb7abb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.6551] device (tap855ecfce-40): carrier: link connected
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.656 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[cf54c0bb-574f-4211-a999-e8d5e62972f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.669 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[8d463fc3-af62-4bec-b981-cd2db59370e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap855ecfce-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:8b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333414, 'reachable_time': 33402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216650, 'error': None, 'target': 'ovnmeta-855ecfce-44da-448e-8c49-042beea13b27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.680 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a9f21057-9a92-4120-8d02-c3ff1416e2dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:8bcb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 333414, 'tstamp': 333414}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216651, 'error': None, 'target': 'ovnmeta-855ecfce-44da-448e-8c49-042beea13b27', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.691 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b83faf5e-eb5b-4138-9dff-a928a938499c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap855ecfce-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:8b:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333414, 'reachable_time': 33402, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216652, 'error': None, 'target': 'ovnmeta-855ecfce-44da-448e-8c49-042beea13b27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.714 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3f013f-4d09-4317-ad9a-65da6dc9849b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.763 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[36b23be9-6572-476c-8982-b8d4745f45c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.764 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap855ecfce-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.765 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.765 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap855ecfce-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.801 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 NetworkManager[56537]: <info>  [1772212458.8027] manager: (tap855ecfce-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Feb 27 17:14:18 compute-0 kernel: tap855ecfce-40: entered promiscuous mode
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.805 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.807 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap855ecfce-40, col_values=(('external_ids', {'iface-id': 'bb4d67da-b4b6-489e-9c40-4e331d346267'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.808 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_controller[96756]: 2026-02-27T17:14:18Z|00050|binding|INFO|Releasing lport bb4d67da-b4b6-489e-9c40-4e331d346267 from this chassis (sb_readonly=0)
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.809 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.811 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/855ecfce-44da-448e-8c49-042beea13b27.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/855ecfce-44da-448e-8c49-042beea13b27.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.812 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[578018cb-4584-4484-b1e6-81af861d4a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.812 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-855ecfce-44da-448e-8c49-042beea13b27
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/855ecfce-44da-448e-8c49-042beea13b27.pid.haproxy
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 855ecfce-44da-448e-8c49-042beea13b27
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:14:18 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:18.813 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-855ecfce-44da-448e-8c49-042beea13b27', 'env', 'PROCESS_TAG=haproxy-855ecfce-44da-448e-8c49-042beea13b27', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/855ecfce-44da-448e-8c49-042beea13b27.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.815 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.877 186844 DEBUG nova.compute.manager [req-aaf880a0-737f-46f8-b2b9-87659345c78d req-6dabcd76-a05a-4743-b8db-d5168ed6b96d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.878 186844 DEBUG oslo_concurrency.lockutils [req-aaf880a0-737f-46f8-b2b9-87659345c78d req-6dabcd76-a05a-4743-b8db-d5168ed6b96d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.878 186844 DEBUG oslo_concurrency.lockutils [req-aaf880a0-737f-46f8-b2b9-87659345c78d req-6dabcd76-a05a-4743-b8db-d5168ed6b96d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.879 186844 DEBUG oslo_concurrency.lockutils [req-aaf880a0-737f-46f8-b2b9-87659345c78d req-6dabcd76-a05a-4743-b8db-d5168ed6b96d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.879 186844 DEBUG nova.compute.manager [req-aaf880a0-737f-46f8-b2b9-87659345c78d req-6dabcd76-a05a-4743-b8db-d5168ed6b96d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Processing event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.927 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212458.9270525, 32210ed6-c54d-46f8-8f4c-28d3ebd66edb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.928 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] VM Started (Lifecycle Event)
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.931 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.936 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.939 186844 INFO nova.virt.libvirt.driver [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance spawned successfully.
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.940 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.953 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.959 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.963 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.963 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.964 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.964 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.965 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.966 186844 DEBUG nova.virt.libvirt.driver [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.974 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.975 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212458.928148, 32210ed6-c54d-46f8-8f4c-28d3ebd66edb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.975 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] VM Paused (Lifecycle Event)
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.994 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.998 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212458.937551, 32210ed6-c54d-46f8-8f4c-28d3ebd66edb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:14:18 compute-0 nova_compute[186840]: 2026-02-27 17:14:18.999 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] VM Resumed (Lifecycle Event)
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.019 186844 INFO nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Took 5.98 seconds to spawn the instance on the hypervisor.
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.019 186844 DEBUG nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.021 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.029 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.064 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.096 186844 INFO nova.compute.manager [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Took 6.68 seconds to build instance.
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.121 186844 DEBUG oslo_concurrency.lockutils [None req-9af67d07-026b-48da-920e-cbd8da8d9ba9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.187 186844 DEBUG nova.network.neutron [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updated VIF entry in instance network info cache for port d12ccaa4-2084-47aa-9d33-a6fe5a03b378. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.188 186844 DEBUG nova.network.neutron [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:19 compute-0 nova_compute[186840]: 2026-02-27 17:14:19.215 186844 DEBUG oslo_concurrency.lockutils [req-33d2ad60-65c8-4032-95ce-4f7869aef20a req-69569013-4102-4090-93e5-37b294e6a5df 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:19 compute-0 podman[216691]: 2026-02-27 17:14:19.181333287 +0000 UTC m=+0.034699152 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:14:19 compute-0 podman[216691]: 2026-02-27 17:14:19.627519917 +0000 UTC m=+0.480885792 container create ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:14:19 compute-0 systemd[1]: Started libpod-conmon-ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a.scope.
Feb 27 17:14:19 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:14:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7cf61b50cf760b9e5de5ddc7a9ff1c278423a1e59ce3497577d7787c88a0746e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:14:19 compute-0 podman[216691]: 2026-02-27 17:14:19.853464244 +0000 UTC m=+0.706830119 container init ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:14:19 compute-0 podman[216691]: 2026-02-27 17:14:19.858596584 +0000 UTC m=+0.711962429 container start ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 27 17:14:19 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [NOTICE]   (216711) : New worker (216713) forked
Feb 27 17:14:19 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [NOTICE]   (216711) : Loading success.
Feb 27 17:14:20 compute-0 nova_compute[186840]: 2026-02-27 17:14:20.282 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.018 186844 DEBUG nova.compute.manager [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.019 186844 DEBUG oslo_concurrency.lockutils [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.019 186844 DEBUG oslo_concurrency.lockutils [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.020 186844 DEBUG oslo_concurrency.lockutils [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.020 186844 DEBUG nova.compute.manager [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.021 186844 WARNING nova.compute.manager [req-386ad70a-bb0d-4160-8d81-f8edfb1d8cc9 req-d1254642-90eb-4b90-98dd-dd4938eb25cf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 for instance with vm_state active and task_state None.
Feb 27 17:14:21 compute-0 nova_compute[186840]: 2026-02-27 17:14:21.886 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:22 compute-0 ovn_controller[96756]: 2026-02-27T17:14:22Z|00051|binding|INFO|Releasing lport bb4d67da-b4b6-489e-9c40-4e331d346267 from this chassis (sb_readonly=0)
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.494 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:22 compute-0 NetworkManager[56537]: <info>  [1772212462.4951] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Feb 27 17:14:22 compute-0 NetworkManager[56537]: <info>  [1772212462.4963] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Feb 27 17:14:22 compute-0 ovn_controller[96756]: 2026-02-27T17:14:22Z|00052|binding|INFO|Releasing lport bb4d67da-b4b6-489e-9c40-4e331d346267 from this chassis (sb_readonly=0)
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.503 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.508 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.899 186844 DEBUG nova.compute.manager [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.900 186844 DEBUG nova.compute.manager [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing instance network info cache due to event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.901 186844 DEBUG oslo_concurrency.lockutils [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.901 186844 DEBUG oslo_concurrency.lockutils [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:22 compute-0 nova_compute[186840]: 2026-02-27 17:14:22.902 186844 DEBUG nova.network.neutron [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing network info cache for port d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:14:24 compute-0 nova_compute[186840]: 2026-02-27 17:14:24.636 186844 DEBUG nova.network.neutron [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updated VIF entry in instance network info cache for port d12ccaa4-2084-47aa-9d33-a6fe5a03b378. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:14:24 compute-0 nova_compute[186840]: 2026-02-27 17:14:24.637 186844 DEBUG nova.network.neutron [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:24 compute-0 nova_compute[186840]: 2026-02-27 17:14:24.673 186844 DEBUG oslo_concurrency.lockutils [req-6250cd27-c1a6-4642-98bd-41e43ada23eb req-82ee480a-a5b2-43ea-9a2a-f9b0df78b4a6 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:25 compute-0 nova_compute[186840]: 2026-02-27 17:14:25.285 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:26 compute-0 nova_compute[186840]: 2026-02-27 17:14:26.890 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:30 compute-0 nova_compute[186840]: 2026-02-27 17:14:30.286 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:30 compute-0 podman[216726]: 2026-02-27 17:14:30.706290488 +0000 UTC m=+0.100881991 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:14:31 compute-0 nova_compute[186840]: 2026-02-27 17:14:31.893 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:32 compute-0 ovn_controller[96756]: 2026-02-27T17:14:32Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:25:bc 10.100.0.12
Feb 27 17:14:32 compute-0 ovn_controller[96756]: 2026-02-27T17:14:32Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:25:bc 10.100.0.12
Feb 27 17:14:33 compute-0 podman[216754]: 2026-02-27 17:14:33.689914366 +0000 UTC m=+0.092883334 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:14:35 compute-0 nova_compute[186840]: 2026-02-27 17:14:35.289 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:36 compute-0 podman[216773]: 2026-02-27 17:14:36.702201084 +0000 UTC m=+0.110356393 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:14:36 compute-0 nova_compute[186840]: 2026-02-27 17:14:36.896 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:39 compute-0 nova_compute[186840]: 2026-02-27 17:14:39.623 186844 INFO nova.compute.manager [None req-75bff1fb-ac14-49ca-8418-f58c9985a930 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Get console output
Feb 27 17:14:39 compute-0 nova_compute[186840]: 2026-02-27 17:14:39.630 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:14:39 compute-0 podman[216797]: 2026-02-27 17:14:39.667634278 +0000 UTC m=+0.073125482 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 27 17:14:40 compute-0 nova_compute[186840]: 2026-02-27 17:14:40.293 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:41 compute-0 nova_compute[186840]: 2026-02-27 17:14:41.899 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:44 compute-0 podman[216819]: 2026-02-27 17:14:44.666804675 +0000 UTC m=+0.073966967 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 27 17:14:45 compute-0 nova_compute[186840]: 2026-02-27 17:14:45.295 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:46 compute-0 nova_compute[186840]: 2026-02-27 17:14:46.903 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:47.089 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:48 compute-0 nova_compute[186840]: 2026-02-27 17:14:48.546 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:48 compute-0 nova_compute[186840]: 2026-02-27 17:14:48.547 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:48 compute-0 nova_compute[186840]: 2026-02-27 17:14:48.548 186844 DEBUG nova.objects.instance [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'flavor' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:49 compute-0 nova_compute[186840]: 2026-02-27 17:14:49.378 186844 DEBUG nova.objects.instance [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_requests' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:49 compute-0 nova_compute[186840]: 2026-02-27 17:14:49.400 186844 DEBUG nova.network.neutron [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:14:49 compute-0 nova_compute[186840]: 2026-02-27 17:14:49.603 186844 DEBUG nova.policy [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:14:49 compute-0 podman[216839]: 2026-02-27 17:14:49.670356394 +0000 UTC m=+0.069959930 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:14:50 compute-0 nova_compute[186840]: 2026-02-27 17:14:50.296 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:50 compute-0 nova_compute[186840]: 2026-02-27 17:14:50.430 186844 DEBUG nova.network.neutron [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Successfully created port: 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.396 186844 DEBUG nova.network.neutron [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Successfully updated port: 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.420 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.420 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.421 186844 DEBUG nova.network.neutron [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.535 186844 DEBUG nova.compute.manager [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-changed-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.536 186844 DEBUG nova.compute.manager [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing instance network info cache due to event network-changed-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.536 186844 DEBUG oslo_concurrency.lockutils [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:51 compute-0 nova_compute[186840]: 2026-02-27 17:14:51.907 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.782 186844 DEBUG nova.network.neutron [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.810 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.812 186844 DEBUG oslo_concurrency.lockutils [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.813 186844 DEBUG nova.network.neutron [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing network info cache for port 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.817 186844 DEBUG nova.virt.libvirt.vif [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.818 186844 DEBUG nova.network.os_vif_util [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.819 186844 DEBUG nova.network.os_vif_util [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.820 186844 DEBUG os_vif [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.822 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.822 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.826 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.826 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1602f9d3-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.827 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1602f9d3-10, col_values=(('external_ids', {'iface-id': '1602f9d3-10f3-4bdf-a7bd-5993abecf6f2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:cc:d4', 'vm-uuid': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:53 compute-0 NetworkManager[56537]: <info>  [1772212493.8311] manager: (tap1602f9d3-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.835 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.839 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.839 186844 INFO os_vif [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10')
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.841 186844 DEBUG nova.virt.libvirt.vif [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.841 186844 DEBUG nova.network.os_vif_util [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.842 186844 DEBUG nova.network.os_vif_util [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.847 186844 DEBUG nova.virt.libvirt.guest [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] attach device xml: <interface type="ethernet">
Feb 27 17:14:53 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:1e:cc:d4"/>
Feb 27 17:14:53 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:14:53 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:14:53 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:14:53 compute-0 nova_compute[186840]:   <target dev="tap1602f9d3-10"/>
Feb 27 17:14:53 compute-0 nova_compute[186840]: </interface>
Feb 27 17:14:53 compute-0 nova_compute[186840]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Feb 27 17:14:53 compute-0 NetworkManager[56537]: <info>  [1772212493.8637] manager: (tap1602f9d3-10): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Feb 27 17:14:53 compute-0 kernel: tap1602f9d3-10: entered promiscuous mode
Feb 27 17:14:53 compute-0 ovn_controller[96756]: 2026-02-27T17:14:53Z|00053|binding|INFO|Claiming lport 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 for this chassis.
Feb 27 17:14:53 compute-0 ovn_controller[96756]: 2026-02-27T17:14:53Z|00054|binding|INFO|1602f9d3-10f3-4bdf-a7bd-5993abecf6f2: Claiming fa:16:3e:1e:cc:d4 10.100.0.25
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.867 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.871 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.883 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:cc:d4 10.100.0.25'], port_security=['fa:16:3e:1e:cc:d4 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1574ff8-e90c-4236-8943-cdf237bd2014', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=141772b8-031c-4f0d-b1c1-6911d4809bde, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.885 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 in datapath c1574ff8-e90c-4236-8943-cdf237bd2014 bound to our chassis
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.888 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c1574ff8-e90c-4236-8943-cdf237bd2014
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.890 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 ovn_controller[96756]: 2026-02-27T17:14:53Z|00055|binding|INFO|Setting lport 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 ovn-installed in OVS
Feb 27 17:14:53 compute-0 ovn_controller[96756]: 2026-02-27T17:14:53Z|00056|binding|INFO|Setting lport 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 up in Southbound
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.895 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.898 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c29c198a-b771-46ca-ab11-81eaa1b5fc29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.900 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc1574ff8-e1 in ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.903 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc1574ff8-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.903 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9f2d6295-3bde-4ac0-b42b-3794e4797d64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.903 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[81a3e84b-044c-4a6d-8b38-89d0ffd20576]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 systemd-udevd[216870]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.917 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[e1f527a4-dc2d-454e-a966-a6f40a2d2542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 NetworkManager[56537]: <info>  [1772212493.9225] device (tap1602f9d3-10): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:14:53 compute-0 NetworkManager[56537]: <info>  [1772212493.9239] device (tap1602f9d3-10): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.933 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[182fe889-10dd-4d2d-a5d3-d0416e63085e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.955 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[bb555391-6784-4bb0-a2e3-1ff1cf8316ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 systemd-udevd[216873]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.961 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c72c41-f30b-4bb0-b3ae-ea2df4327cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:53 compute-0 NetworkManager[56537]: <info>  [1772212493.9630] manager: (tapc1574ff8-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.964 186844 DEBUG nova.virt.libvirt.driver [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.964 186844 DEBUG nova.virt.libvirt.driver [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.964 186844 DEBUG nova.virt.libvirt.driver [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:ff:25:bc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:14:53 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.964 186844 DEBUG nova.virt.libvirt.driver [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:1e:cc:d4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:14:53 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:53.996 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c8f78b-2b03-4b2c-8eb5-2f302d29277c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:53.999 186844 DEBUG nova.virt.libvirt.guest [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:53</nova:creationTime>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:54 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     <nova:port uuid="1602f9d3-10f3-4bdf-a7bd-5993abecf6f2">
Feb 27 17:14:54 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Feb 27 17:14:54 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:54 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:54 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:54 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.000 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[da239c05-0565-4951-8e57-4becb0c2b50a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 NetworkManager[56537]: <info>  [1772212494.0236] device (tapc1574ff8-e0): carrier: link connected
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.030 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a89d07c0-2664-4e89-a06d-af36b2b12bc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.038 186844 DEBUG oslo_concurrency.lockutils [None req-f5408dfd-eab4-4d1b-b6c7-185db9184a04 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.051 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[fe93f0d5-5799-4a82-ac80-faa0ab9b9db7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1574ff8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:dd:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336951, 'reachable_time': 16000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216896, 'error': None, 'target': 'ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.067 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[fa38abf6-9125-49a4-8c3c-1390ba343cbf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:dde0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 336951, 'tstamp': 336951}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216897, 'error': None, 'target': 'ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.081 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[417f2786-4944-48f5-8208-9445486f724c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc1574ff8-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:dd:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336951, 'reachable_time': 16000, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216898, 'error': None, 'target': 'ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.109 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1b388f50-9b26-465a-a4c9-e1345fe98cc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.183 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e27e1f09-98ad-4a62-b1d2-d6ec462f44df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.185 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1574ff8-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.186 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.186 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1574ff8-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:54 compute-0 kernel: tapc1574ff8-e0: entered promiscuous mode
Feb 27 17:14:54 compute-0 NetworkManager[56537]: <info>  [1772212494.1898] manager: (tapc1574ff8-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.189 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.192 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc1574ff8-e0, col_values=(('external_ids', {'iface-id': 'accddece-0438-454e-a48e-f69ee045c957'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.194 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:54 compute-0 ovn_controller[96756]: 2026-02-27T17:14:54Z|00057|binding|INFO|Releasing lport accddece-0438-454e-a48e-f69ee045c957 from this chassis (sb_readonly=0)
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.196 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c1574ff8-e90c-4236-8943-cdf237bd2014.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c1574ff8-e90c-4236-8943-cdf237bd2014.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.197 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b3188bc5-46e1-4481-b9c8-1cec59c45b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.198 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.198 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-c1574ff8-e90c-4236-8943-cdf237bd2014
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/c1574ff8-e90c-4236-8943-cdf237bd2014.pid.haproxy
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID c1574ff8-e90c-4236-8943-cdf237bd2014
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:14:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:54.199 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014', 'env', 'PROCESS_TAG=haproxy-c1574ff8-e90c-4236-8943-cdf237bd2014', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c1574ff8-e90c-4236-8943-cdf237bd2014.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.212 186844 DEBUG nova.compute.manager [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.213 186844 DEBUG oslo_concurrency.lockutils [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.213 186844 DEBUG oslo_concurrency.lockutils [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.213 186844 DEBUG oslo_concurrency.lockutils [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.214 186844 DEBUG nova.compute.manager [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:54 compute-0 nova_compute[186840]: 2026-02-27 17:14:54.214 186844 WARNING nova.compute.manager [req-21613a86-6957-47df-b607-7f7973a5de8e req-55ee060c-8158-4bc9-9259-e755e07a6e2f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 for instance with vm_state active and task_state None.
Feb 27 17:14:54 compute-0 podman[216930]: 2026-02-27 17:14:54.57520553 +0000 UTC m=+0.074114631 container create 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:14:54 compute-0 systemd[1]: Started libpod-conmon-3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e.scope.
Feb 27 17:14:54 compute-0 podman[216930]: 2026-02-27 17:14:54.53611307 +0000 UTC m=+0.035022271 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:14:54 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:14:54 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69e53cffe582f25a5c867dd2e396476426c94a7a4f2590b83a638d668fc87be7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:14:54 compute-0 podman[216930]: 2026-02-27 17:14:54.652937737 +0000 UTC m=+0.151846838 container init 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223)
Feb 27 17:14:54 compute-0 podman[216930]: 2026-02-27 17:14:54.662759645 +0000 UTC m=+0.161668766 container start 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 27 17:14:54 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [NOTICE]   (216950) : New worker (216952) forked
Feb 27 17:14:54 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [NOTICE]   (216950) : Loading success.
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.299 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.353 186844 DEBUG nova.network.neutron [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updated VIF entry in instance network info cache for port 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.354 186844 DEBUG nova.network.neutron [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.376 186844 DEBUG oslo_concurrency.lockutils [req-000a0fd8-7de1-40ff-8296-2640778ad201 req-73a2151b-f8ae-41c4-b5aa-0fc7dd51c27c 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.780 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.780 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.802 186844 DEBUG nova.objects.instance [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'flavor' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.835 186844 DEBUG nova.virt.libvirt.vif [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.835 186844 DEBUG nova.network.os_vif_util [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.836 186844 DEBUG nova.network.os_vif_util [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.838 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.840 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.842 186844 DEBUG nova.virt.libvirt.driver [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Attempting to detach device tap1602f9d3-10 from instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.843 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] detach device xml: <interface type="ethernet">
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:1e:cc:d4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <target dev="tap1602f9d3-10"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </interface>
Feb 27 17:14:55 compute-0 nova_compute[186840]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.849 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.854 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <name>instance-00000003</name>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <uuid>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</uuid>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:53</nova:creationTime>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:port uuid="1602f9d3-10f3-4bdf-a7bd-5993abecf6f2">
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <system>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='serial'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='uuid'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </system>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <os>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </os>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <features>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </features>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk' index='2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config' index='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:ff:25:bc'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='tapd12ccaa4-20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:1e:cc:d4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='tap1602f9d3-10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='net1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       </target>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </console>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <video>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </video>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c629,c814</label>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c629,c814</imagelabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </domain>
Feb 27 17:14:55 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.855 186844 INFO nova.virt.libvirt.driver [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully detached device tap1602f9d3-10 from instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb from the persistent domain config.
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.855 186844 DEBUG nova.virt.libvirt.driver [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] (1/8): Attempting to detach device tap1602f9d3-10 with device alias net1 from instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.856 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] detach device xml: <interface type="ethernet">
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:1e:cc:d4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <target dev="tap1602f9d3-10"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </interface>
Feb 27 17:14:55 compute-0 nova_compute[186840]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 27 17:14:55 compute-0 kernel: tap1602f9d3-10 (unregistering): left promiscuous mode
Feb 27 17:14:55 compute-0 NetworkManager[56537]: <info>  [1772212495.9550] device (tap1602f9d3-10): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.961 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 ovn_controller[96756]: 2026-02-27T17:14:55Z|00058|binding|INFO|Releasing lport 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 from this chassis (sb_readonly=0)
Feb 27 17:14:55 compute-0 ovn_controller[96756]: 2026-02-27T17:14:55Z|00059|binding|INFO|Setting lport 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 down in Southbound
Feb 27 17:14:55 compute-0 ovn_controller[96756]: 2026-02-27T17:14:55Z|00060|binding|INFO|Removing iface tap1602f9d3-10 ovn-installed in OVS
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.963 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.967 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.969 186844 DEBUG nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Received event <DeviceRemovedEvent: 1772212495.969449, 32210ed6-c54d-46f8-8f4c-28d3ebd66edb => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Feb 27 17:14:55 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:55.969 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:cc:d4 10.100.0.25'], port_security=['fa:16:3e:1e:cc:d4 10.100.0.25'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.25/28', 'neutron:device_id': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c1574ff8-e90c-4236-8943-cdf237bd2014', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=141772b8-031c-4f0d-b1c1-6911d4809bde, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.970 186844 DEBUG nova.virt.libvirt.driver [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Start waiting for the detach event from libvirt for device tap1602f9d3-10 with device alias net1 for instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.971 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:55 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:55.971 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 in datapath c1574ff8-e90c-4236-8943-cdf237bd2014 unbound from our chassis
Feb 27 17:14:55 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:55.972 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c1574ff8-e90c-4236-8943-cdf237bd2014, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:14:55 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:55.973 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[762a9e64-8671-447a-b088-6e25655d2eb6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:55 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:55.973 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014 namespace which is not needed anymore
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.974 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <name>instance-00000003</name>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <uuid>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</uuid>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:53</nova:creationTime>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:port uuid="1602f9d3-10f3-4bdf-a7bd-5993abecf6f2">
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.25" ipVersion="4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <system>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='serial'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='uuid'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </system>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <os>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </os>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <features>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </features>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk' index='2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config' index='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:ff:25:bc'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target dev='tapd12ccaa4-20'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       </target>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </console>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <video>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </video>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c629,c814</label>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c629,c814</imagelabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </domain>
Feb 27 17:14:55 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.974 186844 INFO nova.virt.libvirt.driver [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully detached device tap1602f9d3-10 from instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb from the live domain config.
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.975 186844 DEBUG nova.virt.libvirt.vif [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.975 186844 DEBUG nova.network.os_vif_util [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.976 186844 DEBUG nova.network.os_vif_util [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.976 186844 DEBUG os_vif [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.978 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.978 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1602f9d3-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.980 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.982 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.984 186844 INFO os_vif [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10')
Feb 27 17:14:55 compute-0 nova_compute[186840]: 2026-02-27 17:14:55.984 186844 DEBUG nova.virt.libvirt.guest [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:55</nova:creationTime>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:55 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:55 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:55 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:55 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:55 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:14:56 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [NOTICE]   (216950) : haproxy version is 2.8.14-c23fe91
Feb 27 17:14:56 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [NOTICE]   (216950) : path to executable is /usr/sbin/haproxy
Feb 27 17:14:56 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [WARNING]  (216950) : Exiting Master process...
Feb 27 17:14:56 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [ALERT]    (216950) : Current worker (216952) exited with code 143 (Terminated)
Feb 27 17:14:56 compute-0 neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014[216946]: [WARNING]  (216950) : All workers exited. Exiting... (0)
Feb 27 17:14:56 compute-0 systemd[1]: libpod-3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e.scope: Deactivated successfully.
Feb 27 17:14:56 compute-0 podman[216983]: 2026-02-27 17:14:56.089624098 +0000 UTC m=+0.047842572 container died 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:14:56 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e-userdata-shm.mount: Deactivated successfully.
Feb 27 17:14:56 compute-0 systemd[1]: var-lib-containers-storage-overlay-69e53cffe582f25a5c867dd2e396476426c94a7a4f2590b83a638d668fc87be7-merged.mount: Deactivated successfully.
Feb 27 17:14:56 compute-0 podman[216983]: 2026-02-27 17:14:56.134061927 +0000 UTC m=+0.092280361 container cleanup 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
Feb 27 17:14:56 compute-0 systemd[1]: libpod-conmon-3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e.scope: Deactivated successfully.
Feb 27 17:14:56 compute-0 podman[217014]: 2026-02-27 17:14:56.190438085 +0000 UTC m=+0.040366950 container remove 3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.193 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[8681826f-cb49-41fd-bcc5-ccb455d0f43a]: (4, ('Fri Feb 27 05:14:56 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014 (3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e)\n3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e\nFri Feb 27 05:14:56 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014 (3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e)\n3dcbdf0c11684bee364019d04788856eab50cd7a3643a50443b4689ca691456e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.195 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[dfed2863-6434-4c4c-88fa-3ecb74b155a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.196 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1574ff8-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.198 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:56 compute-0 kernel: tapc1574ff8-e0: left promiscuous mode
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.204 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.206 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3ca0b70c-865b-495e-9abf-f114d268b7ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.229 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[54a29091-5d68-461f-ba1a-5551ab4bd140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.230 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[13cfa8f5-9f4b-4301-9d16-c614821ce6c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.241 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4e983c88-5031-47a4-a570-2f95d844e756]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 336944, 'reachable_time': 29750, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217030, 'error': None, 'target': 'ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.243 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c1574ff8-e90c-4236-8943-cdf237bd2014 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:14:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:56.243 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d9e15a-f8a4-423c-95e1-51ca6a5ff57c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:56 compute-0 systemd[1]: run-netns-ovnmeta\x2dc1574ff8\x2de90c\x2d4236\x2d8943\x2dcdf237bd2014.mount: Deactivated successfully.
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.338 186844 DEBUG nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.339 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.339 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.340 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.340 186844 DEBUG nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.341 186844 WARNING nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 for instance with vm_state active and task_state None.
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.341 186844 DEBUG nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-unplugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.341 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.342 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.342 186844 DEBUG oslo_concurrency.lockutils [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.342 186844 DEBUG nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-unplugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.342 186844 WARNING nova.compute.manager [req-76594639-27fc-4043-b0f8-2afa8ad31034 req-6d218591-bec6-4903-89a9-8c19117ba070 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-unplugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 for instance with vm_state active and task_state None.
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.585 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.585 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.586 186844 DEBUG nova.network.neutron [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.663 186844 DEBUG nova.compute.manager [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-deleted-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.664 186844 INFO nova.compute.manager [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Neutron deleted interface 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2; detaching it from the instance and deleting it from the info cache
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.664 186844 DEBUG nova.network.neutron [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.691 186844 DEBUG nova.objects.instance [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lazy-loading 'system_metadata' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.746 186844 DEBUG nova.objects.instance [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lazy-loading 'flavor' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.786 186844 DEBUG nova.virt.libvirt.vif [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.786 186844 DEBUG nova.network.os_vif_util [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.787 186844 DEBUG nova.network.os_vif_util [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.790 186844 DEBUG nova.virt.libvirt.guest [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.795 186844 DEBUG nova.virt.libvirt.guest [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <name>instance-00000003</name>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <uuid>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</uuid>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:55</nova:creationTime>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <system>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='serial'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='uuid'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </system>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <os>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </os>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <features>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </features>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk' index='2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config' index='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:ff:25:bc'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='tapd12ccaa4-20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       </target>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </console>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <video>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </video>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c629,c814</label>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c629,c814</imagelabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]: </domain>
Feb 27 17:14:56 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.795 186844 DEBUG nova.virt.libvirt.guest [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.800 186844 DEBUG nova.virt.libvirt.guest [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:1e:cc:d4"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1602f9d3-10"/></interface>not found in domain: <domain type='kvm' id='3'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <name>instance-00000003</name>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <uuid>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</uuid>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:55</nova:creationTime>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <system>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='serial'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='uuid'>32210ed6-c54d-46f8-8f4c-28d3ebd66edb</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </system>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <os>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </os>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <features>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </features>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk' index='2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/disk.config' index='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:ff:25:bc'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target dev='tapd12ccaa4-20'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       </target>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb/console.log' append='off'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </console>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </input>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <video>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </video>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c629,c814</label>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c629,c814</imagelabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:14:56 compute-0 nova_compute[186840]: </domain>
Feb 27 17:14:56 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.801 186844 WARNING nova.virt.libvirt.driver [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Detaching interface fa:16:3e:1e:cc:d4 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap1602f9d3-10' not found.
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.802 186844 DEBUG nova.virt.libvirt.vif [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.802 186844 DEBUG nova.network.os_vif_util [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converting VIF {"id": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "address": "fa:16:3e:1e:cc:d4", "network": {"id": "c1574ff8-e90c-4236-8943-cdf237bd2014", "bridge": "br-int", "label": "tempest-network-smoke--1571723672", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.25", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1602f9d3-10", "ovs_interfaceid": "1602f9d3-10f3-4bdf-a7bd-5993abecf6f2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.803 186844 DEBUG nova.network.os_vif_util [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.804 186844 DEBUG os_vif [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.806 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.806 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1602f9d3-10, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.806 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.809 186844 INFO os_vif [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:cc:d4,bridge_name='br-int',has_traffic_filtering=True,id=1602f9d3-10f3-4bdf-a7bd-5993abecf6f2,network=Network(c1574ff8-e90c-4236-8943-cdf237bd2014),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1602f9d3-10')
Feb 27 17:14:56 compute-0 nova_compute[186840]: 2026-02-27 17:14:56.810 186844 DEBUG nova.virt.libvirt.guest [req-4d07dfb9-65c2-488d-828a-2672927b89ef req-81a57a18-5cc5-4ce7-bcd5-be6af96e020d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1925262689</nova:name>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:14:56</nova:creationTime>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     <nova:port uuid="d12ccaa4-2084-47aa-9d33-a6fe5a03b378">
Feb 27 17:14:56 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 27 17:14:56 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:14:56 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:14:56 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:14:56 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:14:57 compute-0 ovn_controller[96756]: 2026-02-27T17:14:57Z|00061|binding|INFO|Releasing lport bb4d67da-b4b6-489e-9c40-4e331d346267 from this chassis (sb_readonly=0)
Feb 27 17:14:57 compute-0 nova_compute[186840]: 2026-02-27 17:14:57.950 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:57 compute-0 nova_compute[186840]: 2026-02-27 17:14:57.980 186844 INFO nova.network.neutron [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Port 1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb 27 17:14:57 compute-0 nova_compute[186840]: 2026-02-27 17:14:57.980 186844 DEBUG nova.network.neutron [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [{"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.007 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.035 186844 DEBUG oslo_concurrency.lockutils [None req-9f98c75c-5dd7-4eb7-b640-3d7f3bd12d81 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-32210ed6-c54d-46f8-8f4c-28d3ebd66edb-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 2.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.426 186844 DEBUG nova.compute.manager [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.426 186844 DEBUG oslo_concurrency.lockutils [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.427 186844 DEBUG oslo_concurrency.lockutils [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.427 186844 DEBUG oslo_concurrency.lockutils [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.427 186844 DEBUG nova.compute.manager [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.428 186844 WARNING nova.compute.manager [req-87aa1dad-2c6d-4185-8838-8eea7d54cd0e req-a44d16e6-bd72-467a-83bf-a184a2af92e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-plugged-1602f9d3-10f3-4bdf-a7bd-5993abecf6f2 for instance with vm_state active and task_state None.
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.654 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.654 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.655 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.655 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.656 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.658 186844 INFO nova.compute.manager [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Terminating instance
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.660 186844 DEBUG nova.compute.manager [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:14:58 compute-0 kernel: tapd12ccaa4-20 (unregistering): left promiscuous mode
Feb 27 17:14:58 compute-0 NetworkManager[56537]: <info>  [1772212498.6828] device (tapd12ccaa4-20): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:14:58 compute-0 ovn_controller[96756]: 2026-02-27T17:14:58Z|00062|binding|INFO|Releasing lport d12ccaa4-2084-47aa-9d33-a6fe5a03b378 from this chassis (sb_readonly=0)
Feb 27 17:14:58 compute-0 ovn_controller[96756]: 2026-02-27T17:14:58Z|00063|binding|INFO|Setting lport d12ccaa4-2084-47aa-9d33-a6fe5a03b378 down in Southbound
Feb 27 17:14:58 compute-0 ovn_controller[96756]: 2026-02-27T17:14:58Z|00064|binding|INFO|Removing iface tapd12ccaa4-20 ovn-installed in OVS
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.691 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.701 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.702 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:25:bc 10.100.0.12'], port_security=['fa:16:3e:ff:25:bc 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '32210ed6-c54d-46f8-8f4c-28d3ebd66edb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-855ecfce-44da-448e-8c49-042beea13b27', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '06ab3f43-d57d-4eb1-a403-6b03d8fb5d78', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=554d2234-f016-4ddb-80d6-cd7df22dc480, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=d12ccaa4-2084-47aa-9d33-a6fe5a03b378) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.705 106085 INFO neutron.agent.ovn.metadata.agent [-] Port d12ccaa4-2084-47aa-9d33-a6fe5a03b378 in datapath 855ecfce-44da-448e-8c49-042beea13b27 unbound from our chassis
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.707 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 855ecfce-44da-448e-8c49-042beea13b27, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.708 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3d4ef6e8-4bf7-4997-8126-432eec943891]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.709 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-855ecfce-44da-448e-8c49-042beea13b27 namespace which is not needed anymore
Feb 27 17:14:58 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Feb 27 17:14:58 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.208s CPU time.
Feb 27 17:14:58 compute-0 systemd-machined[156136]: Machine qemu-3-instance-00000003 terminated.
Feb 27 17:14:58 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [NOTICE]   (216711) : haproxy version is 2.8.14-c23fe91
Feb 27 17:14:58 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [NOTICE]   (216711) : path to executable is /usr/sbin/haproxy
Feb 27 17:14:58 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [WARNING]  (216711) : Exiting Master process...
Feb 27 17:14:58 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [ALERT]    (216711) : Current worker (216713) exited with code 143 (Terminated)
Feb 27 17:14:58 compute-0 neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27[216707]: [WARNING]  (216711) : All workers exited. Exiting... (0)
Feb 27 17:14:58 compute-0 systemd[1]: libpod-ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a.scope: Deactivated successfully.
Feb 27 17:14:58 compute-0 podman[217057]: 2026-02-27 17:14:58.843569721 +0000 UTC m=+0.050985069 container died ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 27 17:14:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a-userdata-shm.mount: Deactivated successfully.
Feb 27 17:14:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-7cf61b50cf760b9e5de5ddc7a9ff1c278423a1e59ce3497577d7787c88a0746e-merged.mount: Deactivated successfully.
Feb 27 17:14:58 compute-0 podman[217057]: 2026-02-27 17:14:58.883229364 +0000 UTC m=+0.090644752 container cleanup ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:14:58 compute-0 systemd[1]: libpod-conmon-ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a.scope: Deactivated successfully.
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.911 186844 INFO nova.virt.libvirt.driver [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance destroyed successfully.
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.912 186844 DEBUG nova.objects.instance [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 32210ed6-c54d-46f8-8f4c-28d3ebd66edb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.926 186844 DEBUG nova.virt.libvirt.vif [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:14:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1925262689',display_name='tempest-TestNetworkBasicOps-server-1925262689',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1925262689',id=3,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHqbE6pNlkxoO0W+b+x7Q5g9jRx5fKZTzIT5cnCj1S25nNet8TQur8wbdQf3bGJS1oI9BApghVwZr93w6YpPYwcwqUuSnPqAVK/PCzDBCyGUWFaE32AQbQNowz5d77OzIQ==',key_name='tempest-TestNetworkBasicOps-1661968032',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:14:19Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-43wgwng5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:14:19Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=32210ed6-c54d-46f8-8f4c-28d3ebd66edb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.926 186844 DEBUG nova.network.os_vif_util [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "address": "fa:16:3e:ff:25:bc", "network": {"id": "855ecfce-44da-448e-8c49-042beea13b27", "bridge": "br-int", "label": "tempest-network-smoke--1020370265", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd12ccaa4-20", "ovs_interfaceid": "d12ccaa4-2084-47aa-9d33-a6fe5a03b378", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.927 186844 DEBUG nova.network.os_vif_util [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.927 186844 DEBUG os_vif [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.929 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.929 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd12ccaa4-20, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.931 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.932 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.934 186844 INFO os_vif [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:25:bc,bridge_name='br-int',has_traffic_filtering=True,id=d12ccaa4-2084-47aa-9d33-a6fe5a03b378,network=Network(855ecfce-44da-448e-8c49-042beea13b27),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd12ccaa4-20')
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.934 186844 INFO nova.virt.libvirt.driver [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Deleting instance files /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb_del
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.935 186844 INFO nova.virt.libvirt.driver [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Deletion of /var/lib/nova/instances/32210ed6-c54d-46f8-8f4c-28d3ebd66edb_del complete
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.941 186844 DEBUG nova.compute.manager [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-unplugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.941 186844 DEBUG oslo_concurrency.lockutils [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.941 186844 DEBUG oslo_concurrency.lockutils [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.942 186844 DEBUG oslo_concurrency.lockutils [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.942 186844 DEBUG nova.compute.manager [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-unplugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.942 186844 DEBUG nova.compute.manager [req-010351b9-4e97-40f3-96a1-d38d19430261 req-dc31f6d2-8d8a-4267-ac2b-3c17e25da1bf 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-unplugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:14:58 compute-0 podman[217095]: 2026-02-27 17:14:58.964479827 +0000 UTC m=+0.053726766 container remove ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.968 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[72d0897d-2e07-49ef-8ccf-6bf2543c358f]: (4, ('Fri Feb 27 05:14:58 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27 (ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a)\nec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a\nFri Feb 27 05:14:58 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-855ecfce-44da-448e-8c49-042beea13b27 (ec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a)\nec8a8e36f6c853ee96b36da6860d25eaeece222ea17693b3778bf1f1e2d3140a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.969 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[27147a00-4b4f-4e66-a0cf-b4d01b317550]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.970 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap855ecfce-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.972 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 kernel: tap855ecfce-40: left promiscuous mode
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.975 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.979 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.980 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e1546343-0a5e-45a1-9df3-9ed8de15484d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.988 186844 INFO nova.compute.manager [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.989 186844 DEBUG oslo.service.loopingcall [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.989 186844 DEBUG nova.compute.manager [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:14:58 compute-0 nova_compute[186840]: 2026-02-27 17:14:58.989 186844 DEBUG nova.network.neutron [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.994 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[dce310d7-bd55-48f9-88f3-5f9bc0615039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:58 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:58.995 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[71453655-d127-48df-935b-fa49d607add7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:59 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:59.009 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a4fb6d-4d38-4c3c-8d09-0404a7333c86]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 333409, 'reachable_time': 39033, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217115, 'error': None, 'target': 'ovnmeta-855ecfce-44da-448e-8c49-042beea13b27', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:59 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:59.010 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-855ecfce-44da-448e-8c49-042beea13b27 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:14:59 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:14:59.011 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[babee28f-d6e2-42f4-a055-65bdc015fe6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:14:59 compute-0 systemd[1]: run-netns-ovnmeta\x2d855ecfce\x2d44da\x2d448e\x2d8c49\x2d042beea13b27.mount: Deactivated successfully.
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.482 186844 DEBUG nova.network.neutron [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.502 186844 INFO nova.compute.manager [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Took 0.51 seconds to deallocate network for instance.
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.565 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.565 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.642 186844 DEBUG nova.compute.provider_tree [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.664 186844 DEBUG nova.scheduler.client.report [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.691 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.715 186844 INFO nova.scheduler.client.report [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 32210ed6-c54d-46f8-8f4c-28d3ebd66edb
Feb 27 17:14:59 compute-0 nova_compute[186840]: 2026-02-27 17:14:59.795 186844 DEBUG oslo_concurrency.lockutils [None req-9eaae655-27db-4f3d-9fc7-7f62eb09466e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.301 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.522 186844 DEBUG nova.compute.manager [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.522 186844 DEBUG nova.compute.manager [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing instance network info cache due to event network-changed-d12ccaa4-2084-47aa-9d33-a6fe5a03b378. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.522 186844 DEBUG oslo_concurrency.lockutils [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.522 186844 DEBUG oslo_concurrency.lockutils [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:15:00 compute-0 nova_compute[186840]: 2026-02-27 17:15:00.523 186844 DEBUG nova.network.neutron [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Refreshing network info cache for port d12ccaa4-2084-47aa-9d33-a6fe5a03b378 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.102 186844 DEBUG nova.compute.manager [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.103 186844 DEBUG oslo_concurrency.lockutils [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.103 186844 DEBUG oslo_concurrency.lockutils [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.103 186844 DEBUG oslo_concurrency.lockutils [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "32210ed6-c54d-46f8-8f4c-28d3ebd66edb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.104 186844 DEBUG nova.compute.manager [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] No waiting events found dispatching network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.104 186844 WARNING nova.compute.manager [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received unexpected event network-vif-plugged-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 for instance with vm_state deleted and task_state None.
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.104 186844 DEBUG nova.compute.manager [req-0fb3fee4-5a0e-4ce2-994a-ed082eb1af51 req-71a8867c-358f-47f3-aed5-067d236d4ad1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Received event network-vif-deleted-d12ccaa4-2084-47aa-9d33-a6fe5a03b378 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:01 compute-0 nova_compute[186840]: 2026-02-27 17:15:01.162 186844 DEBUG nova.network.neutron [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:15:01 compute-0 podman[217116]: 2026-02-27 17:15:01.659091239 +0000 UTC m=+0.056980194 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:15:02 compute-0 nova_compute[186840]: 2026-02-27 17:15:02.341 186844 DEBUG nova.network.neutron [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb 27 17:15:02 compute-0 nova_compute[186840]: 2026-02-27 17:15:02.341 186844 DEBUG oslo_concurrency.lockutils [req-3d10f211-dd96-4b38-9cba-b6c432f7c9e6 req-bf2d2fc1-e408-49e8-b030-dc41cf292b15 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-32210ed6-c54d-46f8-8f4c-28d3ebd66edb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:15:03 compute-0 nova_compute[186840]: 2026-02-27 17:15:03.933 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:04 compute-0 podman[217139]: 2026-02-27 17:15:04.682473835 +0000 UTC m=+0.088533751 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 27 17:15:05 compute-0 nova_compute[186840]: 2026-02-27 17:15:05.302 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:06 compute-0 nova_compute[186840]: 2026-02-27 17:15:06.547 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:06 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:06.546 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:15:06 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:06.548 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:15:07 compute-0 nova_compute[186840]: 2026-02-27 17:15:07.714 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:07 compute-0 podman[217159]: 2026-02-27 17:15:07.735189961 +0000 UTC m=+0.141308160 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 27 17:15:07 compute-0 nova_compute[186840]: 2026-02-27 17:15:07.788 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:07 compute-0 nova_compute[186840]: 2026-02-27 17:15:07.831 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:08 compute-0 nova_compute[186840]: 2026-02-27 17:15:08.695 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:08 compute-0 nova_compute[186840]: 2026-02-27 17:15:08.723 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:08 compute-0 nova_compute[186840]: 2026-02-27 17:15:08.935 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:09 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:09.550 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.727 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.728 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.729 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.730 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:15:09 compute-0 podman[217187]: 2026-02-27 17:15:09.867328218 +0000 UTC m=+0.087543416 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.926 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.928 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=73.19416046142578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.928 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:09 compute-0 nova_compute[186840]: 2026-02-27 17:15:09.928 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.017 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.018 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.046 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.066 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.097 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.097 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:10 compute-0 nova_compute[186840]: 2026-02-27 17:15:10.304 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:11 compute-0 nova_compute[186840]: 2026-02-27 17:15:11.095 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:11 compute-0 nova_compute[186840]: 2026-02-27 17:15:11.095 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.727 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.728 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.729 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:12 compute-0 nova_compute[186840]: 2026-02-27 17:15:12.729 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:15:13 compute-0 nova_compute[186840]: 2026-02-27 17:15:13.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:15:13 compute-0 nova_compute[186840]: 2026-02-27 17:15:13.909 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212498.9080184, 32210ed6-c54d-46f8-8f4c-28d3ebd66edb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:15:13 compute-0 nova_compute[186840]: 2026-02-27 17:15:13.910 186844 INFO nova.compute.manager [-] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] VM Stopped (Lifecycle Event)
Feb 27 17:15:13 compute-0 nova_compute[186840]: 2026-02-27 17:15:13.936 186844 DEBUG nova.compute.manager [None req-4d893770-696b-44bc-b293-1ecf780bd98a - - - - - -] [instance: 32210ed6-c54d-46f8-8f4c-28d3ebd66edb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:15:13 compute-0 nova_compute[186840]: 2026-02-27 17:15:13.938 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:15 compute-0 nova_compute[186840]: 2026-02-27 17:15:15.307 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:15 compute-0 podman[217209]: 2026-02-27 17:15:15.708882475 +0000 UTC m=+0.110895504 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 27 17:15:18 compute-0 nova_compute[186840]: 2026-02-27 17:15:18.941 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:20 compute-0 nova_compute[186840]: 2026-02-27 17:15:20.309 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:20 compute-0 podman[217230]: 2026-02-27 17:15:20.649112101 +0000 UTC m=+0.053549121 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 27 17:15:23 compute-0 nova_compute[186840]: 2026-02-27 17:15:23.971 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:25 compute-0 nova_compute[186840]: 2026-02-27 17:15:25.311 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:26 compute-0 sshd-session[217257]: Invalid user admin from 104.234.37.243 port 40764
Feb 27 17:15:26 compute-0 sshd-session[217257]: Connection closed by invalid user admin 104.234.37.243 port 40764 [preauth]
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.392 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.393 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.415 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.530 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.530 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.541 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.542 186844 INFO nova.compute.claims [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.699 186844 DEBUG nova.compute.provider_tree [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.717 186844 DEBUG nova.scheduler.client.report [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.750 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.751 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.817 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.818 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.844 186844 INFO nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:15:26 compute-0 nova_compute[186840]: 2026-02-27 17:15:26.869 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.000 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.002 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.002 186844 INFO nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Creating image(s)
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.003 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.004 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.005 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.029 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.105 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.106 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.107 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.128 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.197 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.198 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.228 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.229 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.230 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.283 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.285 186844 DEBUG nova.virt.disk.api [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.285 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.329 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.330 186844 DEBUG nova.virt.disk.api [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.330 186844 DEBUG nova.objects.instance [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 0b855bd8-86fd-4396-a811-1630f320c70a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.350 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.350 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Ensure instance console log exists: /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.351 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.351 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.352 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:27 compute-0 nova_compute[186840]: 2026-02-27 17:15:27.622 186844 DEBUG nova.policy [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:15:28 compute-0 nova_compute[186840]: 2026-02-27 17:15:28.974 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:29 compute-0 nova_compute[186840]: 2026-02-27 17:15:29.701 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Successfully created port: 50205493-2beb-456a-b247-be275b820e6f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:15:30 compute-0 nova_compute[186840]: 2026-02-27 17:15:30.315 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.030 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Successfully updated port: 50205493-2beb-456a-b247-be275b820e6f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.063 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.064 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.064 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.143 186844 DEBUG nova.compute.manager [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-changed-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.144 186844 DEBUG nova.compute.manager [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing instance network info cache due to event network-changed-50205493-2beb-456a-b247-be275b820e6f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.144 186844 DEBUG oslo_concurrency.lockutils [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:15:31 compute-0 nova_compute[186840]: 2026-02-27 17:15:31.605 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:15:32 compute-0 podman[217274]: 2026-02-27 17:15:32.669165796 +0000 UTC m=+0.071678941 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.672 186844 DEBUG nova.network.neutron [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.698 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.699 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Instance network_info: |[{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.699 186844 DEBUG oslo_concurrency.lockutils [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.700 186844 DEBUG nova.network.neutron [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing network info cache for port 50205493-2beb-456a-b247-be275b820e6f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.705 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Start _get_guest_xml network_info=[{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.712 186844 WARNING nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.726 186844 DEBUG nova.virt.libvirt.host [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.727 186844 DEBUG nova.virt.libvirt.host [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.732 186844 DEBUG nova.virt.libvirt.host [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.733 186844 DEBUG nova.virt.libvirt.host [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.733 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.734 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.735 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.735 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.736 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.736 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.736 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.737 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.737 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.738 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.738 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.739 186844 DEBUG nova.virt.hardware [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.745 186844 DEBUG nova.virt.libvirt.vif [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1739984776',display_name='tempest-TestNetworkBasicOps-server-1739984776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1739984776',id=4,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHroL+RIEyO1BCGLiULgyrlRVrkJ2/jG+ilxFYWubA9ICDFMAn6k7ZewT9gI/rK4tbB6inoPxZIT4kIvecMy58s6ETmSEJvSjqWBiGY4HHC3543wLVK/wxyF1iE5ZBmkzw==',key_name='tempest-TestNetworkBasicOps-1006816395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-1zjsa5ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:15:26Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=0b855bd8-86fd-4396-a811-1630f320c70a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.745 186844 DEBUG nova.network.os_vif_util [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.747 186844 DEBUG nova.network.os_vif_util [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.748 186844 DEBUG nova.objects.instance [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 0b855bd8-86fd-4396-a811-1630f320c70a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.769 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <uuid>0b855bd8-86fd-4396-a811-1630f320c70a</uuid>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <name>instance-00000004</name>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1739984776</nova:name>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:15:32</nova:creationTime>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         <nova:port uuid="50205493-2beb-456a-b247-be275b820e6f">
Feb 27 17:15:32 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <system>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="serial">0b855bd8-86fd-4396-a811-1630f320c70a</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="uuid">0b855bd8-86fd-4396-a811-1630f320c70a</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </system>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <os>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </os>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <features>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </features>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.config"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:8a:58:7c"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <target dev="tap50205493-2b"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/console.log" append="off"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <video>
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </video>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:15:32 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:15:32 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:15:32 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:15:32 compute-0 nova_compute[186840]: </domain>
Feb 27 17:15:32 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.770 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Preparing to wait for external event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.771 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.772 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.772 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.773 186844 DEBUG nova.virt.libvirt.vif [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1739984776',display_name='tempest-TestNetworkBasicOps-server-1739984776',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1739984776',id=4,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHroL+RIEyO1BCGLiULgyrlRVrkJ2/jG+ilxFYWubA9ICDFMAn6k7ZewT9gI/rK4tbB6inoPxZIT4kIvecMy58s6ETmSEJvSjqWBiGY4HHC3543wLVK/wxyF1iE5ZBmkzw==',key_name='tempest-TestNetworkBasicOps-1006816395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-1zjsa5ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:15:26Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=0b855bd8-86fd-4396-a811-1630f320c70a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.773 186844 DEBUG nova.network.os_vif_util [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.774 186844 DEBUG nova.network.os_vif_util [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.774 186844 DEBUG os_vif [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.776 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.777 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.778 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.782 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.783 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50205493-2b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.784 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50205493-2b, col_values=(('external_ids', {'iface-id': '50205493-2beb-456a-b247-be275b820e6f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8a:58:7c', 'vm-uuid': '0b855bd8-86fd-4396-a811-1630f320c70a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.787 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:32 compute-0 NetworkManager[56537]: <info>  [1772212532.7880] manager: (tap50205493-2b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.790 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.793 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.794 186844 INFO os_vif [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b')
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.846 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.847 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.847 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:8a:58:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:15:32 compute-0 nova_compute[186840]: 2026-02-27 17:15:32.848 186844 INFO nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Using config drive
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.626 186844 INFO nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Creating config drive at /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.config
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.632 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpg5min337 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.757 186844 DEBUG oslo_concurrency.processutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpg5min337" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:15:33 compute-0 kernel: tap50205493-2b: entered promiscuous mode
Feb 27 17:15:33 compute-0 NetworkManager[56537]: <info>  [1772212533.8198] manager: (tap50205493-2b): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Feb 27 17:15:33 compute-0 ovn_controller[96756]: 2026-02-27T17:15:33Z|00065|binding|INFO|Claiming lport 50205493-2beb-456a-b247-be275b820e6f for this chassis.
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.822 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:33 compute-0 ovn_controller[96756]: 2026-02-27T17:15:33Z|00066|binding|INFO|50205493-2beb-456a-b247-be275b820e6f: Claiming fa:16:3e:8a:58:7c 10.100.0.13
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.827 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.831 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.844 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:58:7c 10.100.0.13'], port_security=['fa:16:3e:8a:58:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9fb6c0dc-f5cc-4283-9d29-8d3151bcfacb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c0ac065-8e7b-4bf9-8813-be70fe636457, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=50205493-2beb-456a-b247-be275b820e6f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.846 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 50205493-2beb-456a-b247-be275b820e6f in datapath 20273fc4-3c4b-4ae3-a6ed-448130c129da bound to our chassis
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.848 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20273fc4-3c4b-4ae3-a6ed-448130c129da
Feb 27 17:15:33 compute-0 systemd-machined[156136]: New machine qemu-4-instance-00000004.
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.866 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9344ab-aba2-42b9-b36e-091cd0a69b68]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.867 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap20273fc4-31 in ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.870 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap20273fc4-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.870 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[16d9c639-0f9a-4588-8de5-cc526552ff33]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 ovn_controller[96756]: 2026-02-27T17:15:33Z|00067|binding|INFO|Setting lport 50205493-2beb-456a-b247-be275b820e6f ovn-installed in OVS
Feb 27 17:15:33 compute-0 ovn_controller[96756]: 2026-02-27T17:15:33Z|00068|binding|INFO|Setting lport 50205493-2beb-456a-b247-be275b820e6f up in Southbound
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.871 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[53fce8b3-7d68-47bd-82de-8994bcd485df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 nova_compute[186840]: 2026-02-27 17:15:33.872 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:33 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Feb 27 17:15:33 compute-0 systemd-udevd[217320]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.884 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[68c8289c-67c3-4ec2-8f79-c0fbad000892]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 NetworkManager[56537]: <info>  [1772212533.8984] device (tap50205493-2b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:15:33 compute-0 NetworkManager[56537]: <info>  [1772212533.8995] device (tap50205493-2b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.904 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1169e459-5cd0-427c-a2d1-5c886b44ed20]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.936 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[492c8e24-cc12-4fb4-9ec3-5d3952899be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.942 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[de3ac82a-13af-42e4-9073-d207640e69c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 NetworkManager[56537]: <info>  [1772212533.9438] manager: (tap20273fc4-30): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Feb 27 17:15:33 compute-0 systemd-udevd[217323]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.980 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[5389485d-4061-4fac-91cb-e347bd927c11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:33 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:33.986 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[baa400d4-8426-45f2-b550-1ca2e9d95a0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 NetworkManager[56537]: <info>  [1772212534.0093] device (tap20273fc4-30): carrier: link connected
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.014 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[8231fe08-6289-4a7e-a1a7-6121673fef9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.028 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b58e50bb-c924-4ac5-a17f-a5093e6d1586]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20273fc4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:17:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340950, 'reachable_time': 37921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217351, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.040 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb8d6ba-4447-40d7-876e-190af704e425]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0f:173f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340950, 'tstamp': 340950}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217352, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.054 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e13f13-f2bb-4d7f-bd0a-f33ca583f77b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20273fc4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:17:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340950, 'reachable_time': 37921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217353, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.079 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[888d4f48-ada7-4b71-9549-fa21deb736a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.143 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e5889b51-9e02-4ebf-9b2e-e80bf5d57733]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.144 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20273fc4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.145 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.145 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20273fc4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.146 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:34 compute-0 NetworkManager[56537]: <info>  [1772212534.1474] manager: (tap20273fc4-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Feb 27 17:15:34 compute-0 kernel: tap20273fc4-30: entered promiscuous mode
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.150 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.153 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20273fc4-30, col_values=(('external_ids', {'iface-id': '16bd8d62-0b43-4a50-9c21-389d82182e3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:15:34 compute-0 ovn_controller[96756]: 2026-02-27T17:15:34Z|00069|binding|INFO|Releasing lport 16bd8d62-0b43-4a50-9c21-389d82182e3f from this chassis (sb_readonly=0)
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.157 186844 DEBUG nova.compute.manager [req-7f0b85c9-2a54-4ac7-a90e-34639311d8d2 req-ff1655b4-41e6-4931-bb36-21d6bebb58a4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.157 186844 DEBUG oslo_concurrency.lockutils [req-7f0b85c9-2a54-4ac7-a90e-34639311d8d2 req-ff1655b4-41e6-4931-bb36-21d6bebb58a4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.158 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/20273fc4-3c4b-4ae3-a6ed-448130c129da.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/20273fc4-3c4b-4ae3-a6ed-448130c129da.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.158 186844 DEBUG oslo_concurrency.lockutils [req-7f0b85c9-2a54-4ac7-a90e-34639311d8d2 req-ff1655b4-41e6-4931-bb36-21d6bebb58a4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.158 186844 DEBUG oslo_concurrency.lockutils [req-7f0b85c9-2a54-4ac7-a90e-34639311d8d2 req-ff1655b4-41e6-4931-bb36-21d6bebb58a4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.159 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c31e6843-67a0-4a3d-a296-2cdd938ef49e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.159 186844 DEBUG nova.compute.manager [req-7f0b85c9-2a54-4ac7-a90e-34639311d8d2 req-ff1655b4-41e6-4931-bb36-21d6bebb58a4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Processing event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.159 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-20273fc4-3c4b-4ae3-a6ed-448130c129da
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/20273fc4-3c4b-4ae3-a6ed-448130c129da.pid.haproxy
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 20273fc4-3c4b-4ae3-a6ed-448130c129da
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.159 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:34 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:34.160 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'env', 'PROCESS_TAG=haproxy-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/20273fc4-3c4b-4ae3-a6ed-448130c129da.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.163 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.170 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212534.1695433, 0b855bd8-86fd-4396-a811-1630f320c70a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.170 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] VM Started (Lifecycle Event)
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.173 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.176 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.180 186844 INFO nova.virt.libvirt.driver [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Instance spawned successfully.
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.181 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.213 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.222 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.227 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.228 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.228 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.229 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.230 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.231 186844 DEBUG nova.virt.libvirt.driver [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.249 186844 DEBUG nova.network.neutron [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updated VIF entry in instance network info cache for port 50205493-2beb-456a-b247-be275b820e6f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.250 186844 DEBUG nova.network.neutron [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.267 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.268 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212534.169698, 0b855bd8-86fd-4396-a811-1630f320c70a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.269 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] VM Paused (Lifecycle Event)
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.290 186844 DEBUG oslo_concurrency.lockutils [req-43c62f4c-d336-4cc2-8621-34bb64705050 req-36f64f9e-f34d-4ffd-8e58-202187f4cbd9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.319 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.324 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212534.1751432, 0b855bd8-86fd-4396-a811-1630f320c70a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.325 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] VM Resumed (Lifecycle Event)
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.329 186844 INFO nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Took 7.33 seconds to spawn the instance on the hypervisor.
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.330 186844 DEBUG nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.343 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.347 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.372 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.392 186844 INFO nova.compute.manager [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Took 7.91 seconds to build instance.
Feb 27 17:15:34 compute-0 nova_compute[186840]: 2026-02-27 17:15:34.407 186844 DEBUG oslo_concurrency.lockutils [None req-c82eaf50-3985-4394-9748-97af91f484c4 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:34 compute-0 podman[217392]: 2026-02-27 17:15:34.568999133 +0000 UTC m=+0.062805815 container create ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 27 17:15:34 compute-0 systemd[1]: Started libpod-conmon-ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41.scope.
Feb 27 17:15:34 compute-0 podman[217392]: 2026-02-27 17:15:34.539027616 +0000 UTC m=+0.032834338 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:15:34 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:15:34 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08431d861e6e6240a3cf263904fd0fbf2131bc25c28f393eb5a233512bdd411b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:15:34 compute-0 podman[217392]: 2026-02-27 17:15:34.658923327 +0000 UTC m=+0.152730039 container init ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223)
Feb 27 17:15:34 compute-0 podman[217392]: 2026-02-27 17:15:34.664924763 +0000 UTC m=+0.158731475 container start ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 27 17:15:34 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [NOTICE]   (217412) : New worker (217414) forked
Feb 27 17:15:34 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [NOTICE]   (217412) : Loading success.
Feb 27 17:15:35 compute-0 nova_compute[186840]: 2026-02-27 17:15:35.330 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:35 compute-0 podman[217423]: 2026-02-27 17:15:35.663318822 +0000 UTC m=+0.066233759 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.267 186844 DEBUG nova.compute.manager [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.268 186844 DEBUG oslo_concurrency.lockutils [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.268 186844 DEBUG oslo_concurrency.lockutils [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.268 186844 DEBUG oslo_concurrency.lockutils [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.269 186844 DEBUG nova.compute.manager [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] No waiting events found dispatching network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:15:36 compute-0 nova_compute[186840]: 2026-02-27 17:15:36.269 186844 WARNING nova.compute.manager [req-861bd5c3-6e84-46d7-a1e4-73f74be63596 req-ccf41865-e58d-41a1-876a-677ee3ec3042 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received unexpected event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f for instance with vm_state active and task_state None.
Feb 27 17:15:37 compute-0 NetworkManager[56537]: <info>  [1772212537.2727] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Feb 27 17:15:37 compute-0 NetworkManager[56537]: <info>  [1772212537.2734] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Feb 27 17:15:37 compute-0 nova_compute[186840]: 2026-02-27 17:15:37.274 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:37 compute-0 ovn_controller[96756]: 2026-02-27T17:15:37Z|00070|binding|INFO|Releasing lport 16bd8d62-0b43-4a50-9c21-389d82182e3f from this chassis (sb_readonly=0)
Feb 27 17:15:37 compute-0 ovn_controller[96756]: 2026-02-27T17:15:37Z|00071|binding|INFO|Releasing lport 16bd8d62-0b43-4a50-9c21-389d82182e3f from this chassis (sb_readonly=0)
Feb 27 17:15:37 compute-0 nova_compute[186840]: 2026-02-27 17:15:37.286 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:37 compute-0 nova_compute[186840]: 2026-02-27 17:15:37.810 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:38 compute-0 nova_compute[186840]: 2026-02-27 17:15:38.369 186844 DEBUG nova.compute.manager [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-changed-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:38 compute-0 nova_compute[186840]: 2026-02-27 17:15:38.370 186844 DEBUG nova.compute.manager [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing instance network info cache due to event network-changed-50205493-2beb-456a-b247-be275b820e6f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:15:38 compute-0 nova_compute[186840]: 2026-02-27 17:15:38.370 186844 DEBUG oslo_concurrency.lockutils [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:15:38 compute-0 nova_compute[186840]: 2026-02-27 17:15:38.370 186844 DEBUG oslo_concurrency.lockutils [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:15:38 compute-0 nova_compute[186840]: 2026-02-27 17:15:38.370 186844 DEBUG nova.network.neutron [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing network info cache for port 50205493-2beb-456a-b247-be275b820e6f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:15:38 compute-0 podman[217444]: 2026-02-27 17:15:38.694137308 +0000 UTC m=+0.094451784 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 27 17:15:40 compute-0 nova_compute[186840]: 2026-02-27 17:15:40.368 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:40 compute-0 podman[217470]: 2026-02-27 17:15:40.642136934 +0000 UTC m=+0.054177347 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 27 17:15:41 compute-0 nova_compute[186840]: 2026-02-27 17:15:41.440 186844 DEBUG nova.network.neutron [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updated VIF entry in instance network info cache for port 50205493-2beb-456a-b247-be275b820e6f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:15:41 compute-0 nova_compute[186840]: 2026-02-27 17:15:41.440 186844 DEBUG nova.network.neutron [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:15:41 compute-0 nova_compute[186840]: 2026-02-27 17:15:41.457 186844 DEBUG oslo_concurrency.lockutils [req-e619ee8a-d544-41f1-9444-83a4f7f584e0 req-0c41fc36-f79d-4cec-82bb-9924ca1ec808 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:15:42 compute-0 nova_compute[186840]: 2026-02-27 17:15:42.813 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:45 compute-0 nova_compute[186840]: 2026-02-27 17:15:45.370 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:46 compute-0 ovn_controller[96756]: 2026-02-27T17:15:46Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8a:58:7c 10.100.0.13
Feb 27 17:15:46 compute-0 ovn_controller[96756]: 2026-02-27T17:15:46Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8a:58:7c 10.100.0.13
Feb 27 17:15:46 compute-0 podman[217510]: 2026-02-27 17:15:46.675585569 +0000 UTC m=+0.075497554 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 27 17:15:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:47.089 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:15:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:15:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:15:47.091 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:15:47 compute-0 nova_compute[186840]: 2026-02-27 17:15:47.815 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:50 compute-0 nova_compute[186840]: 2026-02-27 17:15:50.371 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:51 compute-0 podman[217530]: 2026-02-27 17:15:51.658345603 +0000 UTC m=+0.064113528 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:15:52 compute-0 nova_compute[186840]: 2026-02-27 17:15:52.222 186844 INFO nova.compute.manager [None req-24a43774-4523-40e2-8bcc-96bc24daa472 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Get console output
Feb 27 17:15:52 compute-0 nova_compute[186840]: 2026-02-27 17:15:52.229 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:15:52 compute-0 nova_compute[186840]: 2026-02-27 17:15:52.820 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:54 compute-0 nova_compute[186840]: 2026-02-27 17:15:54.051 186844 DEBUG nova.compute.manager [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-changed-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:15:54 compute-0 nova_compute[186840]: 2026-02-27 17:15:54.052 186844 DEBUG nova.compute.manager [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing instance network info cache due to event network-changed-50205493-2beb-456a-b247-be275b820e6f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:15:54 compute-0 nova_compute[186840]: 2026-02-27 17:15:54.053 186844 DEBUG oslo_concurrency.lockutils [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:15:54 compute-0 nova_compute[186840]: 2026-02-27 17:15:54.053 186844 DEBUG oslo_concurrency.lockutils [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:15:54 compute-0 nova_compute[186840]: 2026-02-27 17:15:54.054 186844 DEBUG nova.network.neutron [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Refreshing network info cache for port 50205493-2beb-456a-b247-be275b820e6f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:15:55 compute-0 nova_compute[186840]: 2026-02-27 17:15:55.373 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:15:55 compute-0 nova_compute[186840]: 2026-02-27 17:15:55.659 186844 DEBUG nova.network.neutron [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updated VIF entry in instance network info cache for port 50205493-2beb-456a-b247-be275b820e6f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:15:55 compute-0 nova_compute[186840]: 2026-02-27 17:15:55.659 186844 DEBUG nova.network.neutron [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:15:55 compute-0 nova_compute[186840]: 2026-02-27 17:15:55.689 186844 DEBUG oslo_concurrency.lockutils [req-d5133afa-337f-4afc-a78d-9f100a0d4bc5 req-0d23a973-88ab-48c5-996b-5add79b8f037 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:15:56 compute-0 sshd-session[217554]: Invalid user orangepi from 104.234.37.243 port 41858
Feb 27 17:15:56 compute-0 sshd-session[217554]: Connection closed by invalid user orangepi 104.234.37.243 port 41858 [preauth]
Feb 27 17:15:57 compute-0 nova_compute[186840]: 2026-02-27 17:15:57.823 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:00 compute-0 nova_compute[186840]: 2026-02-27 17:16:00.376 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:02 compute-0 nova_compute[186840]: 2026-02-27 17:16:02.828 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:03 compute-0 podman[217556]: 2026-02-27 17:16:03.676578636 +0000 UTC m=+0.071052086 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.422 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.422 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.435 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.509 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.510 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.523 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.524 186844 INFO nova.compute.claims [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.728 186844 DEBUG nova.compute.provider_tree [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.746 186844 DEBUG nova.scheduler.client.report [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.771 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.771 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.822 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.823 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.837 186844 INFO nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.853 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.931 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.933 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.933 186844 INFO nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Creating image(s)
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.933 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.934 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.934 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:04 compute-0 nova_compute[186840]: 2026-02-27 17:16:04.945 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.005 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.006 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.006 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.019 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.076 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.077 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.187 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk 1073741824" returned: 0 in 0.111s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.188 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.189 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.267 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.269 186844 DEBUG nova.virt.disk.api [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.269 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.328 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.328 186844 DEBUG nova.virt.disk.api [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.329 186844 DEBUG nova.objects.instance [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid b575a09b-b52a-4b4f-94f6-b3b96997b910 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.355 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.355 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Ensure instance console log exists: /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.356 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.356 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.357 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.379 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:05 compute-0 nova_compute[186840]: 2026-02-27 17:16:05.655 186844 DEBUG nova.policy [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.689 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}f36a3c79908926fbf1ffe5a8c04712358aa840c4467430368ded5bfe2ce9aba3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.769 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 27 Feb 2026 17:16:05 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-bb2f805e-31cb-47b2-88cc-39ab4bc6ddca x-openstack-request-id: req-bb2f805e-31cb-47b2-88cc-39ab4bc6ddca _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.769 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "94ebc954-199f-4f0d-87f9-2457e240157e", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/94ebc954-199f-4f0d-87f9-2457e240157e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/94ebc954-199f-4f0d-87f9-2457e240157e"}]}, {"id": "a21147e3-c734-4efb-8cc1-463f16e819cd", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.770 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-bb2f805e-31cb-47b2-88cc-39ab4bc6ddca request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.772 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}f36a3c79908926fbf1ffe5a8c04712358aa840c4467430368ded5bfe2ce9aba3" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.842 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 27 Feb 2026 17:16:05 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-c3b3e6a9-10ca-4f72-bfec-2f9e8621d8c5 x-openstack-request-id: req-c3b3e6a9-10ca-4f72-bfec-2f9e8621d8c5 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.843 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "a21147e3-c734-4efb-8cc1-463f16e819cd", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.843 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/a21147e3-c734-4efb-8cc1-463f16e819cd used request id req-c3b3e6a9-10ca-4f72-bfec-2f9e8621d8c5 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.844 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'name': 'tempest-TestNetworkBasicOps-server-1739984776', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000004', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0922444e0aaf445884a7c2fa20793b1f', 'user_id': '427d6e526715473ebe8997007bbff5cd', 'hostId': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.844 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.858 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.859 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64e11464-6ccf-435b-98a7-c73a1ee4bc4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.845078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '000cda30-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': '5dc292f6555c3cf43ff675f7772e9a595aa6c078a5e564050841eaf1564bfb87'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.845078', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '000ce822-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': 'd203824243b90bbac9de7c63bfd86968bbc7f15761b27f452ce572a9c3b79693'}]}, 'timestamp': '2026-02-27 17:16:05.859872', '_unique_id': 'f7a79b18c244494680a7ad1462b16884'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.863 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.865 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.894 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.requests volume: 286 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.894 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a98a8a5-dd4e-4dfe-85fe-13498526fe91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 286, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.865538', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00123354-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': 'd0033315c518cfc781bee6d29e0d38543f095702da59b84e5bb977b316261594'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.865538', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00124132-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '4c39802dc9b8715724b61a02c9afea0c432ce66e9d8492aa4ebd52ae92cbbfc9'}]}, 'timestamp': '2026-02-27 17:16:05.894953', '_unique_id': '316074676e18444a80caadba4800157b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.896 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.897 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.897 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.897 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ebeefa-2da8-4c40-9169-b49f7c575b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.897356', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0012b036-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': '1e3c7b0f23366441b345dccb19ab4c578a8a9b1433fcfb6bbe99f574ccc00a9b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.897356', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '0012c35a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': '67e4ea9a27bc17979c2a7a99bc6e214ac1d25ed911596e8bcd26c22bdda3ff86'}]}, 'timestamp': '2026-02-27 17:16:05.898385', '_unique_id': '3e489460ed12417d8d2944821f82e61d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.899 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.900 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.904 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 0b855bd8-86fd-4396-a811-1630f320c70a / tap50205493-2b inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.905 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367d39c9-aede-458d-b32c-0f53714de692', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.900960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '0013e118-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': 'af11ea176b6152ea4d1275e729a69a5a60f1699f6b472f993a774c2eee2b9cbc'}]}, 'timestamp': '2026-02-27 17:16:05.905643', '_unique_id': '8f7f391600694657911c5076803c2274'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.906 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.907 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.907 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.907 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>]
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.908 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.908 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac3d8e3a-f24f-4458-94db-7ade4459c1f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.908176', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '0014538c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': '5d5b3bb2630c1f4e78c9300831f918a324237ba226dd5895319f75a21682ce80'}]}, 'timestamp': '2026-02-27 17:16:05.908550', '_unique_id': '4055a901ece34f7bbb28b8ec29489081'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.909 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29f2ce05-140e-4440-a45e-cb3696923356', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.909979', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001496ee-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': 'd52ab50d73e0c04f248f9ebd8e4daed4851c23a5826066859b4b22b0cee74c0f'}]}, 'timestamp': '2026-02-27 17:16:05.910211', '_unique_id': '2bca45560d6b40068c518f63d007e0aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.910 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.911 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.911 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.incoming.packets volume: 104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce1ba0f2-b749-4b36-90c8-f970490d5844', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 104, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.911359', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '0014cccc-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': '6882d75bc6f00c752f13356fd75f79aef277ff8e41bc1987a41bdaa96db9e47c'}]}, 'timestamp': '2026-02-27 17:16:05.911586', '_unique_id': 'ab67e12a49b542dd88d6c42c4ff7cae3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.912 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.incoming.bytes volume: 19156 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa71e376-b8ae-4c1b-9c88-884cbc749b5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19156, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.912647', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '0014fefe-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': '2656f694dd2ffcd9968c6e6f222ab90356ff20782cbe55e54998ad690b35b769'}]}, 'timestamp': '2026-02-27 17:16:05.912871', '_unique_id': '79bb6d53cd41417cafd10d42e1bb15a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.913 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.requests volume: 1099 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a03c54ea-7097-4ee1-b429-8891b306fccb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1099, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.913850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00152e2e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '447625e4bf9251c2f1fde1e1cb9b661d1d4b0d28e5b2b35abcd0418c7721add9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': 
None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.913850', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00153720-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '91685e6326e8d6fd96df84da86a86eed8c7eacd2cba3521098b34d3ce40e14c3'}]}, 'timestamp': '2026-02-27 17:16:05.914316', '_unique_id': '77a8d1e15abf45ac9239f0fafa9dedd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.914 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.915 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.915 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.915 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>]
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.915 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.931 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/memory.usage volume: 42.48046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e291bb5-d893-4aee-9de3-eeba3cc7de6e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.48046875, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'timestamp': '2026-02-27T17:16:05.915689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '0017fc3a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.475922895, 'message_signature': '68e7960f47fccd28c9788aca6e1894fb88a141f6d50528ce91536d43f83c3ddd'}]}, 'timestamp': '2026-02-27 17:16:05.932636', '_unique_id': 'e625125cc51d4beba9ecea8cc4812dc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.933 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.935 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.935 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.935 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '054b6f8b-2fdb-4f2e-8a6c-a2e5533642f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.935393', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '00187d4a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '097dc4fd6723effdb3be0023a990e6d109dc9b50ebde4d8fc8e8eea37c6bb2b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.935393', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00188f38-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '8ee5490217fff81465a4a2c4f13dae41a76427883712b41c2215c55c322895d9'}]}, 'timestamp': '2026-02-27 17:16:05.936369', '_unique_id': '0f6cbd6b836845978d0893d38c187646'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.937 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.938 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.latency volume: 7164477615 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.939 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8474ce7d-a14e-431f-a822-77ed4f5277d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7164477615, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.938636', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0018fb44-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '3bb6821f027217a9de20d5d2929a2fb7674649d45d6fd8be3d71dbb3cece0d1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.938636', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '00190ed6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '93782ef2680e34e8c5056003c8f7f80de5fa8cb7ef7171be8696ebe7aae24c0d'}]}, 'timestamp': '2026-02-27 17:16:05.939597', '_unique_id': '6aaf11baf6694b6783da9026a1400ee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.940 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.941 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.942 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93bafff5-1c81-44af-8e5d-7026e6a831e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.942136', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001989c4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': '40b6dd6e5ce6bb3d3a270fb459d71a0b417e60394954462220b3b3e2389e77fe'}]}, 'timestamp': '2026-02-27 17:16:05.942780', '_unique_id': '17f96a28ce9a4fc4bce3bb9c5e633cfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.943 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.944 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.945 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.945 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca9332ba-f958-4c25-af2e-7bd3c46ba1ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.945123', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '0019f9d6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': '8a79c0e7a7b27e9f2a44266d28064b6469ea0021cda40cb69f6f297f83a5c4a4'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.945123', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '001a0b38-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.39006372, 'message_signature': 'df2b87dd3e1b4abd4bb0af610293207c9f24c1699648a28b03ac0facfe919f01'}]}, 'timestamp': '2026-02-27 17:16:05.946048', '_unique_id': '56f486ec154b42f2a452740c10c0d557'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.947 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.948 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.outgoing.bytes volume: 16166 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39732cf4-1787-4b17-8145-7c1a5bc20218', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16166, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.948417', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001a7a14-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': 'b4da7424a05edc89ffe06967857e35f190100d5d282b32e7a50c08d851461ae9'}]}, 'timestamp': '2026-02-27 17:16:05.948919', '_unique_id': '160e02a2b5314af49ecc7213e4372b63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.949 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.950 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.951 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.latency volume: 714332540 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.951 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.latency volume: 165562543 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14ec47db-74f5-4098-addc-9055f0443db8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 714332540, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.951108', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '001ae580-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '0d0a7f3c66a83360bcf744f5d23a879ad0506017199ad8a7f973233759a05a81'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 165562543, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.951108', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '001af71e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': 'ea878ea12e3346bab3d0a7ff67d10f5b2fad7a1a13f63c0d58ad98f829b970ae'}]}, 'timestamp': '2026-02-27 17:16:05.952092', '_unique_id': '6959485fb3bc4ded9a6cdc8575db30a6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.953 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.954 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.954 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.outgoing.packets volume: 111 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3d1ecca-87e0-4a69-b8bf-5f04f99a999f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 111, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.954354', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001b6208-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': 'd11e71578b988949ddf98833617551996ef08891ae68146c08781ea9595c47d7'}]}, 'timestamp': '2026-02-27 17:16:05.954856', '_unique_id': '7abf263389c54f31b1ce45c142f70001'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.955 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.957 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.957 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>]
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.957 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.957 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c78f31c5-01e8-4722-b5b9-864ce7a0d146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.957856', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001beaca-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': 'f53287f13723a63f737c5d006e183409e8a1bcbb762e7514a67636734a40034a'}]}, 'timestamp': '2026-02-27 17:16:05.958406', '_unique_id': 'ffb3868899c3494c832e8d378ad5579e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.959 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.960 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.960 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.bytes volume: 30693888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.961 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9cbc5575-ca3a-4b87-a882-ddff9f05572b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30693888, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-vda', 'timestamp': '2026-02-27T17:16:05.960763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '001c5c44-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': '9e3091a958b9cf0b3c4019a0c5b75e282551b840c9dc4eee70cf7713836d9c61'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a-sda', 'timestamp': '2026-02-27T17:16:05.960763', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '001c6fc2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.410444515, 'message_signature': 'd608880060f6f99ab923c072d4dbfe8fd58cfdf952e3eb09dcd6768ec4bfddbb'}]}, 'timestamp': '2026-02-27 17:16:05.961744', '_unique_id': '7ada4c14e6d44e65bd4089e4e6004a77'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.962 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.963 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.964 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.964 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1739984776>]
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.964 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.964 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f37183a0-7282-4a3d-931c-f4afccfbe0e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000004-0b855bd8-86fd-4396-a811-1630f320c70a-tap50205493-2b', 'timestamp': '2026-02-27T17:16:05.964755', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'tap50205493-2b', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8a:58:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap50205493-2b'}, 'message_id': '001cf816-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.445916066, 'message_signature': '00d7e2422ceeab2c58a21c8710fb9a00653ba3fa262c38f62c43699c4a14bada'}]}, 'timestamp': '2026-02-27 17:16:05.965301', '_unique_id': '960a9a25a2d94991978a78d11ef8061c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.966 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.967 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.967 12 DEBUG ceilometer.compute.pollsters [-] 0b855bd8-86fd-4396-a811-1630f320c70a/cpu volume: 10590000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57fdf797-154a-48e9-83c3-f5a0ded02da6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10590000000, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'timestamp': '2026-02-27T17:16:05.967623', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1739984776', 'name': 'instance-00000004', 'instance_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '001d680a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3441.475922895, 'message_signature': 'b3e735b90712d3fa52b81d91fc187f0cc8eeb2835be74993f0035fb5e6ee8a6a'}]}, 'timestamp': '2026-02-27 17:16:05.968121', '_unique_id': '56efe7fc92cd497cb0dccf9ef9326524'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:16:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:16:05.969 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:16:06 compute-0 podman[217595]: 2026-02-27 17:16:06.642830614 +0000 UTC m=+0.050632961 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 27 17:16:06 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:06.951 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:16:06 compute-0 nova_compute[186840]: 2026-02-27 17:16:06.951 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:06 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:06.952 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:16:07 compute-0 nova_compute[186840]: 2026-02-27 17:16:07.679 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Successfully created port: 661b2c01-df47-4eda-960b-1f78dd4261ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:16:07 compute-0 nova_compute[186840]: 2026-02-27 17:16:07.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:07 compute-0 nova_compute[186840]: 2026-02-27 17:16:07.830 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:08 compute-0 nova_compute[186840]: 2026-02-27 17:16:08.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:09 compute-0 podman[217614]: 2026-02-27 17:16:09.662585661 +0000 UTC m=+0.072038810 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.732 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Successfully updated port: 661b2c01-df47-4eda-960b-1f78dd4261ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.753 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.754 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.754 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.839 186844 DEBUG nova.compute.manager [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-changed-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.840 186844 DEBUG nova.compute.manager [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Refreshing instance network info cache due to event network-changed-661b2c01-df47-4eda-960b-1f78dd4261ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.840 186844 DEBUG oslo_concurrency.lockutils [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:16:09 compute-0 nova_compute[186840]: 2026-02-27 17:16:09.943 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:16:10 compute-0 nova_compute[186840]: 2026-02-27 17:16:10.430 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:10 compute-0 nova_compute[186840]: 2026-02-27 17:16:10.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.151 186844 DEBUG nova.network.neutron [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updating instance_info_cache with network_info: [{"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.174 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.175 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Instance network_info: |[{"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.176 186844 DEBUG oslo_concurrency.lockutils [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.176 186844 DEBUG nova.network.neutron [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Refreshing network info cache for port 661b2c01-df47-4eda-960b-1f78dd4261ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.181 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Start _get_guest_xml network_info=[{"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.187 186844 WARNING nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.205 186844 DEBUG nova.virt.libvirt.host [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.206 186844 DEBUG nova.virt.libvirt.host [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.213 186844 DEBUG nova.virt.libvirt.host [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.214 186844 DEBUG nova.virt.libvirt.host [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.214 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.215 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.216 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.216 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.217 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.217 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.218 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.218 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.219 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.219 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.219 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.220 186844 DEBUG nova.virt.hardware [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.226 186844 DEBUG nova.virt.libvirt.vif [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1492352204',display_name='tempest-TestNetworkBasicOps-server-1492352204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1492352204',id=5,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC42O36no/AfMvx9cszQEqvOTaxx+cbzLug8bKy/SuJERdLxi+aYHMfD11hgKt0gLPR36MMu3KjZzmIDsVmnaZ0GhX4WzbIfMqTmoAuVA0GydURCH0Y2JHvTKmKNFzUZBg==',key_name='tempest-TestNetworkBasicOps-1534514246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-8g6kx1jj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:16:04Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=b575a09b-b52a-4b4f-94f6-b3b96997b910,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.226 186844 DEBUG nova.network.os_vif_util [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.228 186844 DEBUG nova.network.os_vif_util [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.229 186844 DEBUG nova.objects.instance [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid b575a09b-b52a-4b4f-94f6-b3b96997b910 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.253 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <uuid>b575a09b-b52a-4b4f-94f6-b3b96997b910</uuid>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <name>instance-00000005</name>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1492352204</nova:name>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:16:11</nova:creationTime>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         <nova:port uuid="661b2c01-df47-4eda-960b-1f78dd4261ee">
Feb 27 17:16:11 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <system>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="serial">b575a09b-b52a-4b4f-94f6-b3b96997b910</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="uuid">b575a09b-b52a-4b4f-94f6-b3b96997b910</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </system>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <os>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </os>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <features>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </features>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.config"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:6a:97:65"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <target dev="tap661b2c01-df"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/console.log" append="off"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <video>
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </video>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:16:11 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:16:11 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:16:11 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:16:11 compute-0 nova_compute[186840]: </domain>
Feb 27 17:16:11 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.254 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Preparing to wait for external event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.254 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.255 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.255 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.256 186844 DEBUG nova.virt.libvirt.vif [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1492352204',display_name='tempest-TestNetworkBasicOps-server-1492352204',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1492352204',id=5,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC42O36no/AfMvx9cszQEqvOTaxx+cbzLug8bKy/SuJERdLxi+aYHMfD11hgKt0gLPR36MMu3KjZzmIDsVmnaZ0GhX4WzbIfMqTmoAuVA0GydURCH0Y2JHvTKmKNFzUZBg==',key_name='tempest-TestNetworkBasicOps-1534514246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-8g6kx1jj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:16:04Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=b575a09b-b52a-4b4f-94f6-b3b96997b910,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.257 186844 DEBUG nova.network.os_vif_util [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.258 186844 DEBUG nova.network.os_vif_util [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.258 186844 DEBUG os_vif [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.259 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.260 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.260 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.264 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.264 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap661b2c01-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.265 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap661b2c01-df, col_values=(('external_ids', {'iface-id': '661b2c01-df47-4eda-960b-1f78dd4261ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6a:97:65', 'vm-uuid': 'b575a09b-b52a-4b4f-94f6-b3b96997b910'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.267 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 NetworkManager[56537]: <info>  [1772212571.2688] manager: (tap661b2c01-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.269 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.277 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.278 186844 INFO os_vif [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df')
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.373 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.373 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.374 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:6a:97:65, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.375 186844 INFO nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Using config drive
Feb 27 17:16:11 compute-0 podman[217643]: 2026-02-27 17:16:11.39741086 +0000 UTC m=+0.078972668 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, version=9.7, name=ubi9/ubi-minimal, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., 
vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.729 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.729 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.730 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.730 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.769 186844 INFO nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Creating config drive at /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.config
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.775 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpra2r5d8m execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.825 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.900 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.901 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.918 186844 DEBUG oslo_concurrency.processutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpra2r5d8m" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:11 compute-0 kernel: tap661b2c01-df: entered promiscuous mode
Feb 27 17:16:11 compute-0 NetworkManager[56537]: <info>  [1772212571.9717] manager: (tap661b2c01-df): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Feb 27 17:16:11 compute-0 ovn_controller[96756]: 2026-02-27T17:16:11Z|00072|binding|INFO|Claiming lport 661b2c01-df47-4eda-960b-1f78dd4261ee for this chassis.
Feb 27 17:16:11 compute-0 ovn_controller[96756]: 2026-02-27T17:16:11Z|00073|binding|INFO|661b2c01-df47-4eda-960b-1f78dd4261ee: Claiming fa:16:3e:6a:97:65 10.100.0.9
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.974 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 ovn_controller[96756]: 2026-02-27T17:16:11Z|00074|binding|INFO|Setting lport 661b2c01-df47-4eda-960b-1f78dd4261ee ovn-installed in OVS
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.981 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:11.984 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:97:65 10.100.0.9'], port_security=['fa:16:3e:6a:97:65 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b575a09b-b52a-4b4f-94f6-b3b96997b910', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40516b26-10aa-4e25-a5e3-d38ce7bbfd39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c0ac065-8e7b-4bf9-8813-be70fe636457, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=661b2c01-df47-4eda-960b-1f78dd4261ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:16:11 compute-0 ovn_controller[96756]: 2026-02-27T17:16:11Z|00075|binding|INFO|Setting lport 661b2c01-df47-4eda-960b-1f78dd4261ee up in Southbound
Feb 27 17:16:11 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:11.986 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 661b2c01-df47-4eda-960b-1f78dd4261ee in datapath 20273fc4-3c4b-4ae3-a6ed-448130c129da bound to our chassis
Feb 27 17:16:11 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:11.987 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20273fc4-3c4b-4ae3-a6ed-448130c129da
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.989 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:11 compute-0 nova_compute[186840]: 2026-02-27 17:16:11.993 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.004 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.007 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1beb4531-4544-4ae8-9d6f-101fbb0456cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 systemd-machined[156136]: New machine qemu-5-instance-00000005.
Feb 27 17:16:12 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000005.
Feb 27 17:16:12 compute-0 systemd-udevd[217692]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.040 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[514f191e-c8a3-488a-ba0f-7681d0d8917d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.049 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[3eab47cd-0066-44a9-9f39-7c2c5b96cdc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 NetworkManager[56537]: <info>  [1772212572.0554] device (tap661b2c01-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:16:12 compute-0 NetworkManager[56537]: <info>  [1772212572.0562] device (tap661b2c01-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.078 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[4f84f901-a8ab-4698-8385-53c73f0ad838]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.085 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.087 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.097 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b061ca45-0e03-4fe4-8516-3397758fe13e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20273fc4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:17:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340950, 'reachable_time': 37921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217704, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.112 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[6168e8d2-70e3-49ab-8521-f8fdd96b693a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20273fc4-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340959, 'tstamp': 340959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217707, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20273fc4-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340962, 'tstamp': 340962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217707, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.113 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20273fc4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.115 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.116 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.117 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20273fc4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.117 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.117 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20273fc4-30, col_values=(('external_ids', {'iface-id': '16bd8d62-0b43-4a50-9c21-389d82182e3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:12 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:12.118 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.155 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.244 186844 DEBUG nova.compute.manager [req-700419e6-2477-437f-aea0-699fc0d3abaa req-9543cb68-5c6f-4444-b33e-d82c0269dd42 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.248 186844 DEBUG oslo_concurrency.lockutils [req-700419e6-2477-437f-aea0-699fc0d3abaa req-9543cb68-5c6f-4444-b33e-d82c0269dd42 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.248 186844 DEBUG oslo_concurrency.lockutils [req-700419e6-2477-437f-aea0-699fc0d3abaa req-9543cb68-5c6f-4444-b33e-d82c0269dd42 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.248 186844 DEBUG oslo_concurrency.lockutils [req-700419e6-2477-437f-aea0-699fc0d3abaa req-9543cb68-5c6f-4444-b33e-d82c0269dd42 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.248 186844 DEBUG nova.compute.manager [req-700419e6-2477-437f-aea0-699fc0d3abaa req-9543cb68-5c6f-4444-b33e-d82c0269dd42 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Processing event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.299 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212572.2993855, b575a09b-b52a-4b4f-94f6-b3b96997b910 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.300 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] VM Started (Lifecycle Event)
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.302 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.306 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.309 186844 INFO nova.virt.libvirt.driver [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Instance spawned successfully.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.309 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.321 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.326 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.329 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.329 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.329 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.330 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.330 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.331 186844 DEBUG nova.virt.libvirt.driver [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.353 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.353 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212572.299761, b575a09b-b52a-4b4f-94f6-b3b96997b910 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.353 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] VM Paused (Lifecycle Event)
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.372 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.376 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212572.3047438, b575a09b-b52a-4b4f-94f6-b3b96997b910 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.376 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] VM Resumed (Lifecycle Event)
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.382 186844 INFO nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Took 7.45 seconds to spawn the instance on the hypervisor.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.382 186844 DEBUG nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.392 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.393 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.393 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5608MB free_disk=73.16492080688477GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.394 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.394 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.396 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.438 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.466 186844 INFO nova.compute.manager [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Took 7.99 seconds to build instance.
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.487 186844 DEBUG oslo_concurrency.lockutils [None req-902f5e32-6b18-44f1-b6fd-6ca52ee168d8 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.065s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.520 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 0b855bd8-86fd-4396-a811-1630f320c70a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.520 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance b575a09b-b52a-4b4f-94f6-b3b96997b910 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.520 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.521 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.598 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.622 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.652 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.653 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.876 186844 DEBUG nova.network.neutron [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updated VIF entry in instance network info cache for port 661b2c01-df47-4eda-960b-1f78dd4261ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.877 186844 DEBUG nova.network.neutron [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updating instance_info_cache with network_info: [{"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:12 compute-0 nova_compute[186840]: 2026-02-27 17:16:12.909 186844 DEBUG oslo_concurrency.lockutils [req-3e91423e-f993-4a5b-b0b1-ccc2299e69ca req-61c2b39d-b54e-4e40-a652-fb6dc4526bc1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.654 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.655 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.655 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.844 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.844 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquired lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.845 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 27 17:16:13 compute-0 nova_compute[186840]: 2026-02-27 17:16:13.845 186844 DEBUG nova.objects.instance [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0b855bd8-86fd-4396-a811-1630f320c70a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:16:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:13.955 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.378 186844 DEBUG nova.compute.manager [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.379 186844 DEBUG oslo_concurrency.lockutils [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.379 186844 DEBUG oslo_concurrency.lockutils [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.380 186844 DEBUG oslo_concurrency.lockutils [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.380 186844 DEBUG nova.compute.manager [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] No waiting events found dispatching network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:16:14 compute-0 nova_compute[186840]: 2026-02-27 17:16:14.381 186844 WARNING nova.compute.manager [req-32a82add-302d-4a9c-91c0-e608c0ba4935 req-f9b5744f-910d-47ba-a6c6-a8135af92898 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received unexpected event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee for instance with vm_state active and task_state None.
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.044 186844 DEBUG nova.compute.manager [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-changed-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.045 186844 DEBUG nova.compute.manager [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Refreshing instance network info cache due to event network-changed-661b2c01-df47-4eda-960b-1f78dd4261ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.045 186844 DEBUG oslo_concurrency.lockutils [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.046 186844 DEBUG oslo_concurrency.lockutils [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.046 186844 DEBUG nova.network.neutron [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Refreshing network info cache for port 661b2c01-df47-4eda-960b-1f78dd4261ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.337 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [{"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.362 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Releasing lock "refresh_cache-0b855bd8-86fd-4396-a811-1630f320c70a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.362 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.363 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.364 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.364 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.365 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.433 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:15 compute-0 nova_compute[186840]: 2026-02-27 17:16:15.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:16:16 compute-0 nova_compute[186840]: 2026-02-27 17:16:16.268 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:16 compute-0 nova_compute[186840]: 2026-02-27 17:16:16.654 186844 DEBUG nova.network.neutron [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updated VIF entry in instance network info cache for port 661b2c01-df47-4eda-960b-1f78dd4261ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:16:16 compute-0 nova_compute[186840]: 2026-02-27 17:16:16.654 186844 DEBUG nova.network.neutron [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updating instance_info_cache with network_info: [{"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:16 compute-0 nova_compute[186840]: 2026-02-27 17:16:16.686 186844 DEBUG oslo_concurrency.lockutils [req-9916ca0f-ed32-463f-8271-347584629438 req-feefe152-c48c-450c-9565-4a9deb581946 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-b575a09b-b52a-4b4f-94f6-b3b96997b910" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:16:17 compute-0 podman[217717]: 2026-02-27 17:16:17.665449853 +0000 UTC m=+0.072078241 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223)
Feb 27 17:16:20 compute-0 nova_compute[186840]: 2026-02-27 17:16:20.435 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:21 compute-0 nova_compute[186840]: 2026-02-27 17:16:21.309 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:22 compute-0 podman[217742]: 2026-02-27 17:16:22.666428072 +0000 UTC m=+0.068411792 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:16:24 compute-0 ovn_controller[96756]: 2026-02-27T17:16:24Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6a:97:65 10.100.0.9
Feb 27 17:16:24 compute-0 ovn_controller[96756]: 2026-02-27T17:16:24Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6a:97:65 10.100.0.9
Feb 27 17:16:25 compute-0 nova_compute[186840]: 2026-02-27 17:16:25.437 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:26 compute-0 nova_compute[186840]: 2026-02-27 17:16:26.339 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:30 compute-0 sshd-session[217771]: Connection closed by authenticating user root 104.234.37.243 port 57510 [preauth]
Feb 27 17:16:30 compute-0 nova_compute[186840]: 2026-02-27 17:16:30.441 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:31 compute-0 nova_compute[186840]: 2026-02-27 17:16:31.341 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:31 compute-0 nova_compute[186840]: 2026-02-27 17:16:31.710 186844 INFO nova.compute.manager [None req-8e0ba059-43d4-468e-a5e2-2e996bf7c059 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Get console output
Feb 27 17:16:31 compute-0 nova_compute[186840]: 2026-02-27 17:16:31.717 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.060 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.061 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.062 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.062 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.063 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.065 186844 INFO nova.compute.manager [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Terminating instance
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.067 186844 DEBUG nova.compute.manager [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:16:32 compute-0 kernel: tap661b2c01-df (unregistering): left promiscuous mode
Feb 27 17:16:32 compute-0 NetworkManager[56537]: <info>  [1772212592.1000] device (tap661b2c01-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.106 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 ovn_controller[96756]: 2026-02-27T17:16:32Z|00076|binding|INFO|Releasing lport 661b2c01-df47-4eda-960b-1f78dd4261ee from this chassis (sb_readonly=0)
Feb 27 17:16:32 compute-0 ovn_controller[96756]: 2026-02-27T17:16:32Z|00077|binding|INFO|Setting lport 661b2c01-df47-4eda-960b-1f78dd4261ee down in Southbound
Feb 27 17:16:32 compute-0 ovn_controller[96756]: 2026-02-27T17:16:32Z|00078|binding|INFO|Removing iface tap661b2c01-df ovn-installed in OVS
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.111 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.115 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.119 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:97:65 10.100.0.9'], port_security=['fa:16:3e:6a:97:65 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b575a09b-b52a-4b4f-94f6-b3b96997b910', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40516b26-10aa-4e25-a5e3-d38ce7bbfd39', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c0ac065-8e7b-4bf9-8813-be70fe636457, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=661b2c01-df47-4eda-960b-1f78dd4261ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.122 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 661b2c01-df47-4eda-960b-1f78dd4261ee in datapath 20273fc4-3c4b-4ae3-a6ed-448130c129da unbound from our chassis
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.124 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 20273fc4-3c4b-4ae3-a6ed-448130c129da
Feb 27 17:16:32 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Deactivated successfully.
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.140 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1c8d57ba-da8f-4073-93e7-2a1f424934db]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000005.scope: Consumed 12.153s CPU time.
Feb 27 17:16:32 compute-0 systemd-machined[156136]: Machine qemu-5-instance-00000005 terminated.
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.172 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[e615588c-244b-4ec5-9809-f2789ecbc9f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.176 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[90cb4c97-bfe6-4df6-a764-10e5c3ba2161]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.204 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a2b47f76-02a7-4738-8900-c077ae400ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.221 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f76c19ca-0f2f-4baa-80e4-6a49c030b012]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap20273fc4-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0f:17:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340950, 'reachable_time': 37921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217786, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.234 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[35fd537c-4c9a-4ccd-867c-b3b4978eb09c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap20273fc4-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340959, 'tstamp': 340959}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217787, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap20273fc4-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 340962, 'tstamp': 340962}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217787, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.235 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20273fc4-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.237 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.242 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.243 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20273fc4-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.243 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.244 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap20273fc4-30, col_values=(('external_ids', {'iface-id': '16bd8d62-0b43-4a50-9c21-389d82182e3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:32 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:32.244 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.332 186844 INFO nova.virt.libvirt.driver [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Instance destroyed successfully.
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.332 186844 DEBUG nova.objects.instance [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid b575a09b-b52a-4b4f-94f6-b3b96997b910 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.349 186844 DEBUG nova.virt.libvirt.vif [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:16:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1492352204',display_name='tempest-TestNetworkBasicOps-server-1492352204',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1492352204',id=5,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBC42O36no/AfMvx9cszQEqvOTaxx+cbzLug8bKy/SuJERdLxi+aYHMfD11hgKt0gLPR36MMu3KjZzmIDsVmnaZ0GhX4WzbIfMqTmoAuVA0GydURCH0Y2JHvTKmKNFzUZBg==',key_name='tempest-TestNetworkBasicOps-1534514246',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:16:12Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-8g6kx1jj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:16:12Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=b575a09b-b52a-4b4f-94f6-b3b96997b910,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.349 186844 DEBUG nova.network.os_vif_util [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "661b2c01-df47-4eda-960b-1f78dd4261ee", "address": "fa:16:3e:6a:97:65", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap661b2c01-df", "ovs_interfaceid": "661b2c01-df47-4eda-960b-1f78dd4261ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.350 186844 DEBUG nova.network.os_vif_util [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.351 186844 DEBUG os_vif [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.352 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.353 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap661b2c01-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.355 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.358 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.361 186844 INFO os_vif [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6a:97:65,bridge_name='br-int',has_traffic_filtering=True,id=661b2c01-df47-4eda-960b-1f78dd4261ee,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap661b2c01-df')
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.362 186844 INFO nova.virt.libvirt.driver [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Deleting instance files /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910_del
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.363 186844 INFO nova.virt.libvirt.driver [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Deletion of /var/lib/nova/instances/b575a09b-b52a-4b4f-94f6-b3b96997b910_del complete
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.423 186844 INFO nova.compute.manager [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.424 186844 DEBUG oslo.service.loopingcall [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.424 186844 DEBUG nova.compute.manager [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.425 186844 DEBUG nova.network.neutron [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.639 186844 DEBUG nova.compute.manager [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-unplugged-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.640 186844 DEBUG oslo_concurrency.lockutils [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.640 186844 DEBUG oslo_concurrency.lockutils [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.641 186844 DEBUG oslo_concurrency.lockutils [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.641 186844 DEBUG nova.compute.manager [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] No waiting events found dispatching network-vif-unplugged-661b2c01-df47-4eda-960b-1f78dd4261ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:16:32 compute-0 nova_compute[186840]: 2026-02-27 17:16:32.641 186844 DEBUG nova.compute.manager [req-c826bc0c-33d8-427b-bce2-32a069ac5413 req-6720e543-5c80-473f-8206-ec54ba85c5fa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-unplugged-661b2c01-df47-4eda-960b-1f78dd4261ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.019 186844 DEBUG nova.network.neutron [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.046 186844 INFO nova.compute.manager [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Took 1.62 seconds to deallocate network for instance.
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.109 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.110 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.116 186844 DEBUG nova.compute.manager [req-1ff18f3b-f2f8-400b-8e2d-fb989be4e9a5 req-660cca47-7ed0-4b21-bb72-c95f2184a8ef 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-deleted-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.210 186844 DEBUG nova.compute.provider_tree [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.235 186844 DEBUG nova.scheduler.client.report [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.270 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.302 186844 INFO nova.scheduler.client.report [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance b575a09b-b52a-4b4f-94f6-b3b96997b910
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.385 186844 DEBUG oslo_concurrency.lockutils [None req-cd8fb272-7043-445f-926d-d17dc649c9de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.324s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:34 compute-0 podman[217806]: 2026-02-27 17:16:34.66814639 +0000 UTC m=+0.070382610 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.748 186844 DEBUG nova.compute.manager [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.749 186844 DEBUG oslo_concurrency.lockutils [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.750 186844 DEBUG oslo_concurrency.lockutils [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.750 186844 DEBUG oslo_concurrency.lockutils [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "b575a09b-b52a-4b4f-94f6-b3b96997b910-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.751 186844 DEBUG nova.compute.manager [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] No waiting events found dispatching network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:16:34 compute-0 nova_compute[186840]: 2026-02-27 17:16:34.751 186844 WARNING nova.compute.manager [req-a0fdf59c-cc87-4160-9cac-b75af1d27c98 req-90f28fc4-f43a-4f84-a13a-af753897bc86 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Received unexpected event network-vif-plugged-661b2c01-df47-4eda-960b-1f78dd4261ee for instance with vm_state deleted and task_state None.
Feb 27 17:16:35 compute-0 nova_compute[186840]: 2026-02-27 17:16:35.444 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.530 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.531 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.532 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.532 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.532 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.533 186844 INFO nova.compute.manager [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Terminating instance
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.534 186844 DEBUG nova.compute.manager [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:16:36 compute-0 kernel: tap50205493-2b (unregistering): left promiscuous mode
Feb 27 17:16:36 compute-0 NetworkManager[56537]: <info>  [1772212596.5573] device (tap50205493-2b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:16:36 compute-0 ovn_controller[96756]: 2026-02-27T17:16:36Z|00079|binding|INFO|Releasing lport 50205493-2beb-456a-b247-be275b820e6f from this chassis (sb_readonly=0)
Feb 27 17:16:36 compute-0 ovn_controller[96756]: 2026-02-27T17:16:36Z|00080|binding|INFO|Setting lport 50205493-2beb-456a-b247-be275b820e6f down in Southbound
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.599 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 ovn_controller[96756]: 2026-02-27T17:16:36Z|00081|binding|INFO|Removing iface tap50205493-2b ovn-installed in OVS
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.601 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.604 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.607 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8a:58:7c 10.100.0.13'], port_security=['fa:16:3e:8a:58:7c 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '0b855bd8-86fd-4396-a811-1630f320c70a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9fb6c0dc-f5cc-4283-9d29-8d3151bcfacb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c0ac065-8e7b-4bf9-8813-be70fe636457, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=50205493-2beb-456a-b247-be275b820e6f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.608 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 50205493-2beb-456a-b247-be275b820e6f in datapath 20273fc4-3c4b-4ae3-a6ed-448130c129da unbound from our chassis
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.609 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 20273fc4-3c4b-4ae3-a6ed-448130c129da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.610 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3b34925f-08d9-49e2-98eb-9ce94614d827]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.610 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da namespace which is not needed anymore
Feb 27 17:16:36 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Feb 27 17:16:36 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 13.722s CPU time.
Feb 27 17:16:36 compute-0 systemd-machined[156136]: Machine qemu-4-instance-00000004 terminated.
Feb 27 17:16:36 compute-0 podman[217853]: 2026-02-27 17:16:36.723637686 +0000 UTC m=+0.050046986 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223)
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [NOTICE]   (217412) : haproxy version is 2.8.14-c23fe91
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [NOTICE]   (217412) : path to executable is /usr/sbin/haproxy
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [WARNING]  (217412) : Exiting Master process...
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [WARNING]  (217412) : Exiting Master process...
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [ALERT]    (217412) : Current worker (217414) exited with code 143 (Terminated)
Feb 27 17:16:36 compute-0 neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da[217408]: [WARNING]  (217412) : All workers exited. Exiting... (0)
Feb 27 17:16:36 compute-0 systemd[1]: libpod-ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41.scope: Deactivated successfully.
Feb 27 17:16:36 compute-0 podman[217862]: 2026-02-27 17:16:36.738938948 +0000 UTC m=+0.052302941 container died ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:16:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41-userdata-shm.mount: Deactivated successfully.
Feb 27 17:16:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-08431d861e6e6240a3cf263904fd0fbf2131bc25c28f393eb5a233512bdd411b-merged.mount: Deactivated successfully.
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.786 186844 INFO nova.virt.libvirt.driver [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Instance destroyed successfully.
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.787 186844 DEBUG nova.objects.instance [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 0b855bd8-86fd-4396-a811-1630f320c70a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:16:36 compute-0 podman[217862]: 2026-02-27 17:16:36.802285096 +0000 UTC m=+0.115649019 container cleanup ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, tcib_managed=true)
Feb 27 17:16:36 compute-0 systemd[1]: libpod-conmon-ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41.scope: Deactivated successfully.
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.809 186844 DEBUG nova.virt.libvirt.vif [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:15:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1739984776',display_name='tempest-TestNetworkBasicOps-server-1739984776',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1739984776',id=4,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHroL+RIEyO1BCGLiULgyrlRVrkJ2/jG+ilxFYWubA9ICDFMAn6k7ZewT9gI/rK4tbB6inoPxZIT4kIvecMy58s6ETmSEJvSjqWBiGY4HHC3543wLVK/wxyF1iE5ZBmkzw==',key_name='tempest-TestNetworkBasicOps-1006816395',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:15:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-1zjsa5ww',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:15:34Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=0b855bd8-86fd-4396-a811-1630f320c70a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.809 186844 DEBUG nova.network.os_vif_util [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "50205493-2beb-456a-b247-be275b820e6f", "address": "fa:16:3e:8a:58:7c", "network": {"id": "20273fc4-3c4b-4ae3-a6ed-448130c129da", "bridge": "br-int", "label": "tempest-network-smoke--325591318", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50205493-2b", "ovs_interfaceid": "50205493-2beb-456a-b247-be275b820e6f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.810 186844 DEBUG nova.network.os_vif_util [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.810 186844 DEBUG os_vif [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.812 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.812 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50205493-2b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.813 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.816 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.818 186844 INFO os_vif [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8a:58:7c,bridge_name='br-int',has_traffic_filtering=True,id=50205493-2beb-456a-b247-be275b820e6f,network=Network(20273fc4-3c4b-4ae3-a6ed-448130c129da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50205493-2b')
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.819 186844 INFO nova.virt.libvirt.driver [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Deleting instance files /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a_del
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.819 186844 INFO nova.virt.libvirt.driver [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Deletion of /var/lib/nova/instances/0b855bd8-86fd-4396-a811-1630f320c70a_del complete
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.884 186844 INFO nova.compute.manager [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Took 0.35 seconds to destroy the instance on the hypervisor.
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.885 186844 DEBUG oslo.service.loopingcall [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.885 186844 DEBUG nova.compute.manager [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.885 186844 DEBUG nova.network.neutron [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:16:36 compute-0 podman[217923]: 2026-02-27 17:16:36.888983361 +0000 UTC m=+0.069448348 container remove ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.892 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e47960f5-7de0-4014-8001-ce8fe741a1fa]: (4, ('Fri Feb 27 05:16:36 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da (ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41)\nab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41\nFri Feb 27 05:16:36 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da (ab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41)\nab5a9b52ef3b1f3873c07d2f9be13372c028af300ccb8789d7ac033dec1d0c41\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.894 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1816a14e-fa83-4f62-96de-53f44ee9b843]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.895 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20273fc4-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.897 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 kernel: tap20273fc4-30: left promiscuous mode
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.902 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[98bc2806-ebad-4482-9de0-cfa802f07c8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 nova_compute[186840]: 2026-02-27 17:16:36.904 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.924 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[91f7c0ce-85c3-4a41-a04b-281270abbccd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.925 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[8aa2c32b-427b-4c02-b176-eb9acde1ccd7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.942 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7e7caf-6ee4-4b55-885d-7e194ca17896]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 340942, 'reachable_time': 32556, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217938, 'error': None, 'target': 'ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.944 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-20273fc4-3c4b-4ae3-a6ed-448130c129da deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:16:36 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:36.944 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[aeec5609-f19c-486b-923a-5e93db44a952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:16:36 compute-0 systemd[1]: run-netns-ovnmeta\x2d20273fc4\x2d3c4b\x2d4ae3\x2da6ed\x2d448130c129da.mount: Deactivated successfully.
Feb 27 17:16:40 compute-0 nova_compute[186840]: 2026-02-27 17:16:40.446 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:40 compute-0 podman[217939]: 2026-02-27 17:16:40.679141653 +0000 UTC m=+0.086997454 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.312 186844 DEBUG nova.compute.manager [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-unplugged-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.312 186844 DEBUG oslo_concurrency.lockutils [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.313 186844 DEBUG oslo_concurrency.lockutils [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.313 186844 DEBUG oslo_concurrency.lockutils [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.314 186844 DEBUG nova.compute.manager [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] No waiting events found dispatching network-vif-unplugged-50205493-2beb-456a-b247-be275b820e6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.314 186844 DEBUG nova.compute.manager [req-c8bb1cda-05e2-4937-af67-c16bb854348c req-1e46f446-1f59-46ab-87fc-9607521d41a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-unplugged-50205493-2beb-456a-b247-be275b820e6f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.466 186844 DEBUG nova.network.neutron [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.497 186844 INFO nova.compute.manager [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Took 4.61 seconds to deallocate network for instance.
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.550 186844 DEBUG nova.compute.manager [req-369216ef-fc00-4d45-ac5b-9eb3b9b80440 req-1b36cfe5-ad86-4357-b4bf-82db08dc7d2e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-deleted-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.553 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.554 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.627 186844 DEBUG nova.compute.provider_tree [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.648 186844 DEBUG nova.scheduler.client.report [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:16:41 compute-0 podman[217965]: 2026-02-27 17:16:41.680463604 +0000 UTC m=+0.078944108 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.683 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.708 186844 INFO nova.scheduler.client.report [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 0b855bd8-86fd-4396-a811-1630f320c70a
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.796 186844 DEBUG oslo_concurrency.lockutils [None req-3f92c01a-9f93-4bed-9795-7ead27e63f37 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.265s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:41 compute-0 nova_compute[186840]: 2026-02-27 17:16:41.815 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.426 186844 DEBUG nova.compute.manager [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.427 186844 DEBUG oslo_concurrency.lockutils [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.427 186844 DEBUG oslo_concurrency.lockutils [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.428 186844 DEBUG oslo_concurrency.lockutils [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "0b855bd8-86fd-4396-a811-1630f320c70a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.428 186844 DEBUG nova.compute.manager [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] No waiting events found dispatching network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:16:43 compute-0 nova_compute[186840]: 2026-02-27 17:16:43.428 186844 WARNING nova.compute.manager [req-32cc3fbb-e369-42b7-a376-f1d2b8a2b0c7 req-908a89f8-9fc4-436c-819a-d211b6371c97 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Received unexpected event network-vif-plugged-50205493-2beb-456a-b247-be275b820e6f for instance with vm_state deleted and task_state None.
Feb 27 17:16:44 compute-0 nova_compute[186840]: 2026-02-27 17:16:44.887 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:44 compute-0 nova_compute[186840]: 2026-02-27 17:16:44.922 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:45 compute-0 nova_compute[186840]: 2026-02-27 17:16:45.486 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:46 compute-0 nova_compute[186840]: 2026-02-27 17:16:46.818 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:16:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:16:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:16:47.091 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:16:47 compute-0 nova_compute[186840]: 2026-02-27 17:16:47.331 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212592.3285356, b575a09b-b52a-4b4f-94f6-b3b96997b910 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:16:47 compute-0 nova_compute[186840]: 2026-02-27 17:16:47.331 186844 INFO nova.compute.manager [-] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] VM Stopped (Lifecycle Event)
Feb 27 17:16:47 compute-0 nova_compute[186840]: 2026-02-27 17:16:47.357 186844 DEBUG nova.compute.manager [None req-a5e37c75-13a5-4566-944c-7a162446024b - - - - - -] [instance: b575a09b-b52a-4b4f-94f6-b3b96997b910] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:49 compute-0 podman[217989]: 2026-02-27 17:16:49.553003733 +0000 UTC m=+0.954535427 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0)
Feb 27 17:16:50 compute-0 nova_compute[186840]: 2026-02-27 17:16:50.488 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:51 compute-0 nova_compute[186840]: 2026-02-27 17:16:51.784 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212596.782906, 0b855bd8-86fd-4396-a811-1630f320c70a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:16:51 compute-0 nova_compute[186840]: 2026-02-27 17:16:51.785 186844 INFO nova.compute.manager [-] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] VM Stopped (Lifecycle Event)
Feb 27 17:16:51 compute-0 nova_compute[186840]: 2026-02-27 17:16:51.808 186844 DEBUG nova.compute.manager [None req-843eef80-d05e-40dc-9cb9-74007d9d6a0c - - - - - -] [instance: 0b855bd8-86fd-4396-a811-1630f320c70a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:16:51 compute-0 nova_compute[186840]: 2026-02-27 17:16:51.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:53 compute-0 podman[218009]: 2026-02-27 17:16:53.655731302 +0000 UTC m=+0.061285491 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:16:55 compute-0 nova_compute[186840]: 2026-02-27 17:16:55.489 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:16:56 compute-0 nova_compute[186840]: 2026-02-27 17:16:56.825 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:00 compute-0 nova_compute[186840]: 2026-02-27 17:17:00.492 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:01 compute-0 nova_compute[186840]: 2026-02-27 17:17:01.827 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.106 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.106 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.124 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.222 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.222 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.231 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.232 186844 INFO nova.compute.claims [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.366 186844 DEBUG nova.compute.provider_tree [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.400 186844 DEBUG nova.scheduler.client.report [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.447 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.448 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.527 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.527 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.565 186844 INFO nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.588 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.701 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.703 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.704 186844 INFO nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Creating image(s)
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.705 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.706 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.707 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.731 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.804 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.805 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.806 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.829 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.893 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.894 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.973 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk 1073741824" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.975 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:04 compute-0 nova_compute[186840]: 2026-02-27 17:17:04.976 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.050 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.051 186844 DEBUG nova.virt.disk.api [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.051 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.102 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.103 186844 DEBUG nova.virt.disk.api [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.104 186844 DEBUG nova.objects.instance [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.121 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.122 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Ensure instance console log exists: /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.123 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.123 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.124 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.494 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:05 compute-0 podman[218048]: 2026-02-27 17:17:05.660338278 +0000 UTC m=+0.065559887 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:17:05 compute-0 nova_compute[186840]: 2026-02-27 17:17:05.738 186844 DEBUG nova.policy [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:17:06 compute-0 nova_compute[186840]: 2026-02-27 17:17:06.830 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:07 compute-0 podman[218072]: 2026-02-27 17:17:07.675874177 +0000 UTC m=+0.068276255 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 27 17:17:07 compute-0 nova_compute[186840]: 2026-02-27 17:17:07.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:08 compute-0 nova_compute[186840]: 2026-02-27 17:17:08.714 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:08 compute-0 nova_compute[186840]: 2026-02-27 17:17:08.715 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:08 compute-0 nova_compute[186840]: 2026-02-27 17:17:08.849 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Successfully created port: ddbe59f7-465a-458f-a721-e3d5d380e6cc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:17:10 compute-0 nova_compute[186840]: 2026-02-27 17:17:10.495 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.336 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Successfully updated port: ddbe59f7-465a-458f-a721-e3d5d380e6cc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.356 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.356 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.357 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.450 186844 DEBUG nova.compute.manager [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.451 186844 DEBUG nova.compute.manager [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing instance network info cache due to event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.451 186844 DEBUG oslo_concurrency.lockutils [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.515 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:17:11 compute-0 podman[218091]: 2026-02-27 17:17:11.683010176 +0000 UTC m=+0.087436913 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 27 17:17:11 compute-0 podman[218118]: 2026-02-27 17:17:11.77732254 +0000 UTC m=+0.067180917 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, version=9.7, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, distribution-scope=public)
Feb 27 17:17:11 compute-0 nova_compute[186840]: 2026-02-27 17:17:11.832 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.404 186844 DEBUG nova.network.neutron [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.429 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.429 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance network_info: |[{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.430 186844 DEBUG oslo_concurrency.lockutils [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.431 186844 DEBUG nova.network.neutron [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing network info cache for port ddbe59f7-465a-458f-a721-e3d5d380e6cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.436 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Start _get_guest_xml network_info=[{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.442 186844 WARNING nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.450 186844 DEBUG nova.virt.libvirt.host [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.451 186844 DEBUG nova.virt.libvirt.host [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.461 186844 DEBUG nova.virt.libvirt.host [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.462 186844 DEBUG nova.virt.libvirt.host [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.463 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.463 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.464 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.465 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.465 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.466 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.466 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.467 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.467 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.468 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.468 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.469 186844 DEBUG nova.virt.hardware [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.475 186844 DEBUG nova.virt.libvirt.vif [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:17:04Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.475 186844 DEBUG nova.network.os_vif_util [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.477 186844 DEBUG nova.network.os_vif_util [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.478 186844 DEBUG nova.objects.instance [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.502 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <uuid>f087df93-6b03-417d-bc8b-7114adfa61a4</uuid>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <name>instance-00000006</name>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:17:12</nova:creationTime>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:17:12 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <system>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="serial">f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="uuid">f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </system>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <os>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </os>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <features>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </features>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:c8:ca:23"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <target dev="tapddbe59f7-46"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log" append="off"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <video>
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </video>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:17:12 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:17:12 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:17:12 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:17:12 compute-0 nova_compute[186840]: </domain>
Feb 27 17:17:12 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.503 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Preparing to wait for external event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.504 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.504 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.505 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.506 186844 DEBUG nova.virt.libvirt.vif [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:17:04Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.506 186844 DEBUG nova.network.os_vif_util [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.507 186844 DEBUG nova.network.os_vif_util [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.508 186844 DEBUG os_vif [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.508 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.509 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.509 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.513 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.513 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddbe59f7-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.514 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddbe59f7-46, col_values=(('external_ids', {'iface-id': 'ddbe59f7-465a-458f-a721-e3d5d380e6cc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c8:ca:23', 'vm-uuid': 'f087df93-6b03-417d-bc8b-7114adfa61a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.517 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:12 compute-0 NetworkManager[56537]: <info>  [1772212632.5191] manager: (tapddbe59f7-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.520 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.522 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.523 186844 INFO os_vif [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46')
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.590 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.591 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.591 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:c8:ca:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.592 186844 INFO nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Using config drive
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.695 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.758 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.760 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.761 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.761 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 27 17:17:12 compute-0 nova_compute[186840]: 2026-02-27 17:17:12.788 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.047 186844 INFO nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Creating config drive at /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.053 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpwwsgy4qa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.182 186844 DEBUG oslo_concurrency.processutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpwwsgy4qa" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:13 compute-0 kernel: tapddbe59f7-46: entered promiscuous mode
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.2353] manager: (tapddbe59f7-46): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Feb 27 17:17:13 compute-0 ovn_controller[96756]: 2026-02-27T17:17:13Z|00082|binding|INFO|Claiming lport ddbe59f7-465a-458f-a721-e3d5d380e6cc for this chassis.
Feb 27 17:17:13 compute-0 ovn_controller[96756]: 2026-02-27T17:17:13Z|00083|binding|INFO|ddbe59f7-465a-458f-a721-e3d5d380e6cc: Claiming fa:16:3e:c8:ca:23 10.100.0.6
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.237 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.240 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.243 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.249 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.271 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:ca:23 10.100.0.6'], port_security=['fa:16:3e:c8:ca:23 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4694fc97-3ead-4f0e-a0fa-02f879d98eb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53a09580-e670-430e-8e67-1c6e90b35016, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=ddbe59f7-465a-458f-a721-e3d5d380e6cc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:17:13 compute-0 systemd-udevd[218160]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.273 106085 INFO neutron.agent.ovn.metadata.agent [-] Port ddbe59f7-465a-458f-a721-e3d5d380e6cc in datapath 8e14168a-35d3-4dd3-9225-5c6c14ef7d52 bound to our chassis
Feb 27 17:17:13 compute-0 systemd-machined[156136]: New machine qemu-6-instance-00000006.
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.275 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8e14168a-35d3-4dd3-9225-5c6c14ef7d52
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.283 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[162ffa9f-a775-43c5-8a36-fd99acedd463]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.284 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8e14168a-31 in ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.286 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8e14168a-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.286 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9d11226e-2040-42bc-a3ba-f010bf9caa4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.287 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[75d41896-5ecf-44c5-892c-bed73b4d8e5e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.2885] device (tapddbe59f7-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.2891] device (tapddbe59f7-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:17:13 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000006.
Feb 27 17:17:13 compute-0 ovn_controller[96756]: 2026-02-27T17:17:13Z|00084|binding|INFO|Setting lport ddbe59f7-465a-458f-a721-e3d5d380e6cc ovn-installed in OVS
Feb 27 17:17:13 compute-0 ovn_controller[96756]: 2026-02-27T17:17:13Z|00085|binding|INFO|Setting lport ddbe59f7-465a-458f-a721-e3d5d380e6cc up in Southbound
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.293 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.296 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[1a050d99-1a01-4391-a4f9-64a2abe27e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.316 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1fba16-51f3-4db7-ad35-bbd390fc97b4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.340 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[395c2ff0-7b8e-4ca4-bd51-919feccb1c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 systemd-udevd[218163]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.344 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[19520038-e394-4bf5-8b86-6bc6cba2e782]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.3456] manager: (tap8e14168a-30): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.371 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[d29d84b7-5182-4a9d-b9a7-a7777c74a379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.375 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[020c3942-65c0-418c-9b8f-807d73d62534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.3931] device (tap8e14168a-30): carrier: link connected
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.395 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[933cd579-3d12-476f-8dcd-8cb1a30bc3de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.406 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f34e25-842d-4e5c-aa28-c1b3fd5f0618]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e14168a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:4c:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350888, 'reachable_time': 42225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218193, 'error': None, 'target': 'ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.417 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[afcbd308-23cc-4b93-80dd-fed6e3b40ba7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:4c0f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 350888, 'tstamp': 350888}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218194, 'error': None, 'target': 'ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.428 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[25a1d23e-17dd-4f48-8bd3-75438bb502c7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8e14168a-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:63:4c:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350888, 'reachable_time': 42225, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218195, 'error': None, 'target': 'ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.451 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f3d82eca-7966-4e4c-9537-2a13dc1a36b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.488 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[417368dc-1740-445b-9431-6f5a025ef0ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.489 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e14168a-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.489 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.490 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8e14168a-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.492 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 kernel: tap8e14168a-30: entered promiscuous mode
Feb 27 17:17:13 compute-0 NetworkManager[56537]: <info>  [1772212633.4931] manager: (tap8e14168a-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.494 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8e14168a-30, col_values=(('external_ids', {'iface-id': '7974a05e-8c1c-4bb6-b06c-d51011d44f74'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:13 compute-0 ovn_controller[96756]: 2026-02-27T17:17:13Z|00086|binding|INFO|Releasing lport 7974a05e-8c1c-4bb6-b06c-d51011d44f74 from this chassis (sb_readonly=0)
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.503 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.504 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8e14168a-35d3-4dd3-9225-5c6c14ef7d52.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8e14168a-35d3-4dd3-9225-5c6c14ef7d52.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.504 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ab8eec67-f725-4af8-8d00-3dac85fda875]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.505 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-8e14168a-35d3-4dd3-9225-5c6c14ef7d52
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/8e14168a-35d3-4dd3-9225-5c6c14ef7d52.pid.haproxy
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 8e14168a-35d3-4dd3-9225-5c6c14ef7d52
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:17:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:13.506 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'env', 'PROCESS_TAG=haproxy-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8e14168a-35d3-4dd3-9225-5c6c14ef7d52.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.720 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.742 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.743 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.744 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.744 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.803 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:13 compute-0 podman[218227]: 2026-02-27 17:17:13.827046521 +0000 UTC m=+0.054463670 container create 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.846 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.847 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:13 compute-0 systemd[1]: Started libpod-conmon-4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99.scope.
Feb 27 17:17:13 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:17:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/62c9773153dba8e633a74dd8943e14111a074a4c0019933a5ed1cc7c229259e6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:17:13 compute-0 podman[218227]: 2026-02-27 17:17:13.791384701 +0000 UTC m=+0.018801880 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:17:13 compute-0 podman[218227]: 2026-02-27 17:17:13.896659088 +0000 UTC m=+0.124076237 container init 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 27 17:17:13 compute-0 podman[218227]: 2026-02-27 17:17:13.900189416 +0000 UTC m=+0.127606565 container start 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, io.buildah.version=1.43.0)
Feb 27 17:17:13 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [NOTICE]   (218257) : New worker (218259) forked
Feb 27 17:17:13 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [NOTICE]   (218257) : Loading success.
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.922 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.948 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212633.948084, f087df93-6b03-417d-bc8b-7114adfa61a4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.949 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] VM Started (Lifecycle Event)
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.954 186844 DEBUG nova.compute.manager [req-654005a8-eb62-462e-90ab-6405de560a88 req-e473f9e4-9df3-4448-b2ee-a5f985e736f1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.954 186844 DEBUG oslo_concurrency.lockutils [req-654005a8-eb62-462e-90ab-6405de560a88 req-e473f9e4-9df3-4448-b2ee-a5f985e736f1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.954 186844 DEBUG oslo_concurrency.lockutils [req-654005a8-eb62-462e-90ab-6405de560a88 req-e473f9e4-9df3-4448-b2ee-a5f985e736f1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.955 186844 DEBUG oslo_concurrency.lockutils [req-654005a8-eb62-462e-90ab-6405de560a88 req-e473f9e4-9df3-4448-b2ee-a5f985e736f1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.955 186844 DEBUG nova.compute.manager [req-654005a8-eb62-462e-90ab-6405de560a88 req-e473f9e4-9df3-4448-b2ee-a5f985e736f1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Processing event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.955 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.959 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.962 186844 INFO nova.virt.libvirt.driver [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance spawned successfully.
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.962 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.969 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.971 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.979 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.980 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.980 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.981 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.981 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.981 186844 DEBUG nova.virt.libvirt.driver [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.990 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.990 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212633.951527, f087df93-6b03-417d-bc8b-7114adfa61a4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:17:13 compute-0 nova_compute[186840]: 2026-02-27 17:17:13.990 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] VM Paused (Lifecycle Event)
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.010 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.013 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212633.9586043, f087df93-6b03-417d-bc8b-7114adfa61a4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.013 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] VM Resumed (Lifecycle Event)
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.040 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.042 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.052 186844 INFO nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Took 9.35 seconds to spawn the instance on the hypervisor.
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.052 186844 DEBUG nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.061 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.118 186844 INFO nova.compute.manager [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Took 9.94 seconds to build instance.
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.133 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.135 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5708MB free_disk=73.1933479309082GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.136 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.136 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.138 186844 DEBUG oslo_concurrency.lockutils [None req-31a775de-b088-4cb8-a8fa-3c4c9483ea19 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.144 186844 DEBUG nova.network.neutron [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updated VIF entry in instance network info cache for port ddbe59f7-465a-458f-a721-e3d5d380e6cc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.145 186844 DEBUG nova.network.neutron [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.153 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:14.154 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:17:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:14.155 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.176 186844 DEBUG oslo_concurrency.lockutils [req-c75180ae-9a5c-4873-be81-ed99f4ce3c7d req-768ab782-ed2c-4a48-bf16-20e3ec04192d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.276 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance f087df93-6b03-417d-bc8b-7114adfa61a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.276 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.276 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.394 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.410 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.444 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.444 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.308s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.445 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:14 compute-0 nova_compute[186840]: 2026-02-27 17:17:14.445 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 27 17:17:15 compute-0 nova_compute[186840]: 2026-02-27 17:17:15.442 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:15 compute-0 nova_compute[186840]: 2026-02-27 17:17:15.497 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.060 186844 DEBUG nova.compute.manager [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.061 186844 DEBUG oslo_concurrency.lockutils [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.061 186844 DEBUG oslo_concurrency.lockutils [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.062 186844 DEBUG oslo_concurrency.lockutils [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.062 186844 DEBUG nova.compute.manager [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.063 186844 WARNING nova.compute.manager [req-d3d0ee6e-2090-4ffb-816f-ea806f7c2ad2 req-9a0864ee-598d-42a9-a7e0-b1953683ad36 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc for instance with vm_state active and task_state None.
Feb 27 17:17:16 compute-0 ovn_controller[96756]: 2026-02-27T17:17:16Z|00087|binding|INFO|Releasing lport 7974a05e-8c1c-4bb6-b06c-d51011d44f74 from this chassis (sb_readonly=0)
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.314 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:16 compute-0 NetworkManager[56537]: <info>  [1772212636.3153] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Feb 27 17:17:16 compute-0 NetworkManager[56537]: <info>  [1772212636.3161] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Feb 27 17:17:16 compute-0 ovn_controller[96756]: 2026-02-27T17:17:16Z|00088|binding|INFO|Releasing lport 7974a05e-8c1c-4bb6-b06c-d51011d44f74 from this chassis (sb_readonly=0)
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.324 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.329 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:16 compute-0 nova_compute[186840]: 2026-02-27 17:17:16.698 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:17:17 compute-0 nova_compute[186840]: 2026-02-27 17:17:17.545 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:17 compute-0 nova_compute[186840]: 2026-02-27 17:17:17.702 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:18 compute-0 nova_compute[186840]: 2026-02-27 17:17:18.912 186844 DEBUG nova.compute.manager [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:18 compute-0 nova_compute[186840]: 2026-02-27 17:17:18.913 186844 DEBUG nova.compute.manager [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing instance network info cache due to event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:17:18 compute-0 nova_compute[186840]: 2026-02-27 17:17:18.913 186844 DEBUG oslo_concurrency.lockutils [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:18 compute-0 nova_compute[186840]: 2026-02-27 17:17:18.914 186844 DEBUG oslo_concurrency.lockutils [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:18 compute-0 nova_compute[186840]: 2026-02-27 17:17:18.914 186844 DEBUG nova.network.neutron [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing network info cache for port ddbe59f7-465a-458f-a721-e3d5d380e6cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:17:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:19.157 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:19 compute-0 podman[218272]: 2026-02-27 17:17:19.703311954 +0000 UTC m=+0.102338165 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 27 17:17:20 compute-0 nova_compute[186840]: 2026-02-27 17:17:20.499 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:21 compute-0 nova_compute[186840]: 2026-02-27 17:17:21.724 186844 DEBUG nova.network.neutron [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updated VIF entry in instance network info cache for port ddbe59f7-465a-458f-a721-e3d5d380e6cc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:17:21 compute-0 nova_compute[186840]: 2026-02-27 17:17:21.725 186844 DEBUG nova.network.neutron [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:21 compute-0 nova_compute[186840]: 2026-02-27 17:17:21.755 186844 DEBUG oslo_concurrency.lockutils [req-58992e2c-a868-41f1-b909-8603a502f2a6 req-ba67e959-b3f6-4821-b696-21d350806e13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:22 compute-0 nova_compute[186840]: 2026-02-27 17:17:22.597 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:24 compute-0 podman[218314]: 2026-02-27 17:17:24.672482892 +0000 UTC m=+0.073907595 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:17:24 compute-0 ovn_controller[96756]: 2026-02-27T17:17:24Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c8:ca:23 10.100.0.6
Feb 27 17:17:24 compute-0 ovn_controller[96756]: 2026-02-27T17:17:24Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c8:ca:23 10.100.0.6
Feb 27 17:17:25 compute-0 nova_compute[186840]: 2026-02-27 17:17:25.507 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:27 compute-0 nova_compute[186840]: 2026-02-27 17:17:27.600 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:30 compute-0 nova_compute[186840]: 2026-02-27 17:17:30.510 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:32 compute-0 nova_compute[186840]: 2026-02-27 17:17:32.174 186844 INFO nova.compute.manager [None req-7c88878c-eb6c-48df-9c4e-58bdb9ed00a9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Get console output
Feb 27 17:17:32 compute-0 nova_compute[186840]: 2026-02-27 17:17:32.184 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:17:32 compute-0 nova_compute[186840]: 2026-02-27 17:17:32.654 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.186 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.187 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.188 186844 DEBUG nova.objects.instance [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'flavor' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.512 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.565 186844 DEBUG nova.objects.instance [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_requests' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:35 compute-0 nova_compute[186840]: 2026-02-27 17:17:35.580 186844 DEBUG nova.network.neutron [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.522 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.541 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Triggering sync for uuid f087df93-6b03-417d-bc8b-7114adfa61a4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.543 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.544 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.577 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.034s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:36 compute-0 podman[218339]: 2026-02-27 17:17:36.67540139 +0000 UTC m=+0.078911151 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:17:36 compute-0 nova_compute[186840]: 2026-02-27 17:17:36.709 186844 DEBUG nova.policy [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:17:37 compute-0 nova_compute[186840]: 2026-02-27 17:17:37.658 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:38 compute-0 podman[218364]: 2026-02-27 17:17:38.675321218 +0000 UTC m=+0.076481399 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 27 17:17:38 compute-0 nova_compute[186840]: 2026-02-27 17:17:38.858 186844 DEBUG nova.network.neutron [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Successfully created port: 3e6806d8-8dee-4392-befe-ef55f59117ce _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.054 186844 DEBUG nova.network.neutron [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Successfully updated port: 3e6806d8-8dee-4392-befe-ef55f59117ce _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.074 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.074 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.074 186844 DEBUG nova.network.neutron [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.167 186844 DEBUG nova.compute.manager [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-changed-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.167 186844 DEBUG nova.compute.manager [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing instance network info cache due to event network-changed-3e6806d8-8dee-4392-befe-ef55f59117ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.167 186844 DEBUG oslo_concurrency.lockutils [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:40 compute-0 nova_compute[186840]: 2026-02-27 17:17:40.515 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.590 186844 DEBUG nova.network.neutron [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.617 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.618 186844 DEBUG oslo_concurrency.lockutils [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.619 186844 DEBUG nova.network.neutron [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing network info cache for port 3e6806d8-8dee-4392-befe-ef55f59117ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.624 186844 DEBUG nova.virt.libvirt.vif [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.625 186844 DEBUG nova.network.os_vif_util [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.626 186844 DEBUG nova.network.os_vif_util [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.626 186844 DEBUG os_vif [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.628 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.629 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.629 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.640 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.640 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e6806d8-8d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.641 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e6806d8-8d, col_values=(('external_ids', {'iface-id': '3e6806d8-8dee-4392-befe-ef55f59117ce', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:c2:e1', 'vm-uuid': 'f087df93-6b03-417d-bc8b-7114adfa61a4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.644 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.6464] manager: (tap3e6806d8-8d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.648 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.653 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.654 186844 INFO os_vif [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d')
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.655 186844 DEBUG nova.virt.libvirt.vif [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.655 186844 DEBUG nova.network.os_vif_util [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.656 186844 DEBUG nova.network.os_vif_util [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.658 186844 DEBUG nova.virt.libvirt.guest [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] attach device xml: <interface type="ethernet">
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:63:c2:e1"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <target dev="tap3e6806d8-8d"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]: </interface>
Feb 27 17:17:42 compute-0 nova_compute[186840]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Feb 27 17:17:42 compute-0 podman[218383]: 2026-02-27 17:17:42.665372932 +0000 UTC m=+0.067646479 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, release=1770267347, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Feb 27 17:17:42 compute-0 kernel: tap3e6806d8-8d: entered promiscuous mode
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.6698] manager: (tap3e6806d8-8d): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Feb 27 17:17:42 compute-0 ovn_controller[96756]: 2026-02-27T17:17:42Z|00089|binding|INFO|Claiming lport 3e6806d8-8dee-4392-befe-ef55f59117ce for this chassis.
Feb 27 17:17:42 compute-0 ovn_controller[96756]: 2026-02-27T17:17:42Z|00090|binding|INFO|3e6806d8-8dee-4392-befe-ef55f59117ce: Claiming fa:16:3e:63:c2:e1 10.100.0.19
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.672 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.683 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:c2:e1 10.100.0.19'], port_security=['fa:16:3e:63:c2:e1 10.100.0.19'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-def28598-31df-42ab-92d1-b43c240c6127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc893a01-f8c4-4bb0-8d1a-3097917beb15, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=3e6806d8-8dee-4392-befe-ef55f59117ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.686 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 3e6806d8-8dee-4392-befe-ef55f59117ce in datapath def28598-31df-42ab-92d1-b43c240c6127 bound to our chassis
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.687 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network def28598-31df-42ab-92d1-b43c240c6127
Feb 27 17:17:42 compute-0 podman[218384]: 2026-02-27 17:17:42.696670473 +0000 UTC m=+0.090042718 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.697 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.696 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[812fdf58-2605-4b47-a621-d759439ae606]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.697 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdef28598-31 in ovnmeta-def28598-31df-42ab-92d1-b43c240c6127 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:17:42 compute-0 systemd-udevd[218436]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:17:42 compute-0 ovn_controller[96756]: 2026-02-27T17:17:42Z|00091|binding|INFO|Setting lport 3e6806d8-8dee-4392-befe-ef55f59117ce ovn-installed in OVS
Feb 27 17:17:42 compute-0 ovn_controller[96756]: 2026-02-27T17:17:42Z|00092|binding|INFO|Setting lport 3e6806d8-8dee-4392-befe-ef55f59117ce up in Southbound
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.701 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.698 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdef28598-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.698 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[02624df7-62f6-4820-b31f-a32307a699ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.701 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[d66840f1-6376-4b5a-9231-ea6fb5e7e547]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.7100] device (tap3e6806d8-8d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.7108] device (tap3e6806d8-8d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.709 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[b86cb137-0a69-4c0d-b608-a93d0c423dd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.723 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c55cc1-9e6e-44f9-9d09-3c22a7855605]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.747 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[4dbbd190-a057-4bc3-b4ed-5543f44d7825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 systemd-udevd[218440]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.7531] manager: (tapdef28598-30): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.752 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[bb8a9b0b-d9eb-45a5-b654-3919f5525855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.776 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[e05e4bc7-d78c-4ce6-8453-c42776dfe19e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.779 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[95fe1393-1b44-4063-81fb-52da290b4787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.7947] device (tapdef28598-30): carrier: link connected
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.799 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[349f8a37-1695-4f50-a969-6c00f23a0eaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.814 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9231b4a5-5d9b-4e35-8d5f-9f5c63802d14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdef28598-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:03:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353828, 'reachable_time': 16604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218463, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.828 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[5de11f5c-a806-4640-941e-b29d5ecdf80a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:3ff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353828, 'tstamp': 353828}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218464, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.844 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3b907777-bc2b-4513-ace3-a81fc46fc5a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdef28598-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:03:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353828, 'reachable_time': 16604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218465, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.868 186844 DEBUG nova.virt.libvirt.driver [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.868 186844 DEBUG nova.virt.libvirt.driver [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.869 186844 DEBUG nova.virt.libvirt.driver [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:c8:ca:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.869 186844 DEBUG nova.virt.libvirt.driver [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:63:c2:e1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.871 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ece208d6-d81a-44d6-9ef5-6f6d2770026b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.905 186844 DEBUG nova.virt.libvirt.guest [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:17:42</nova:creationTime>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:17:42 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     <nova:port uuid="3e6806d8-8dee-4392-befe-ef55f59117ce">
Feb 27 17:17:42 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Feb 27 17:17:42 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:17:42 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:17:42 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:17:42 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.911 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9d531cad-1043-46d5-b726-2281226bb6e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.912 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef28598-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.912 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.912 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdef28598-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 NetworkManager[56537]: <info>  [1772212662.9146] manager: (tapdef28598-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.914 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 kernel: tapdef28598-30: entered promiscuous mode
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.917 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdef28598-30, col_values=(('external_ids', {'iface-id': '35120014-f89b-47df-b6c5-52840b05e2b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:42 compute-0 ovn_controller[96756]: 2026-02-27T17:17:42Z|00093|binding|INFO|Releasing lport 35120014-f89b-47df-b6c5-52840b05e2b5 from this chassis (sb_readonly=0)
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.918 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.925 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.926 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/def28598-31df-42ab-92d1-b43c240c6127.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/def28598-31df-42ab-92d1-b43c240c6127.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.926 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1b3fd0-ee85-4eff-98e8-b07c4077eec7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.927 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-def28598-31df-42ab-92d1-b43c240c6127
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/def28598-31df-42ab-92d1-b43c240c6127.pid.haproxy
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID def28598-31df-42ab-92d1-b43c240c6127
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:17:42 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:42.927 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'env', 'PROCESS_TAG=haproxy-def28598-31df-42ab-92d1-b43c240c6127', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/def28598-31df-42ab-92d1-b43c240c6127.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:17:42 compute-0 nova_compute[186840]: 2026-02-27 17:17:42.941 186844 DEBUG oslo_concurrency.lockutils [None req-bf29fab7-ccc3-4be2-83b5-3eda3c976f0c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.001 186844 DEBUG nova.compute.manager [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.001 186844 DEBUG oslo_concurrency.lockutils [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.001 186844 DEBUG oslo_concurrency.lockutils [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.002 186844 DEBUG oslo_concurrency.lockutils [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.002 186844 DEBUG nova.compute.manager [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:17:43 compute-0 nova_compute[186840]: 2026-02-27 17:17:43.002 186844 WARNING nova.compute.manager [req-4aa3c325-a8d8-464a-b732-35e6119bc59e req-ef751504-d2c0-41c9-b332-f5481946c65b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce for instance with vm_state active and task_state None.
Feb 27 17:17:43 compute-0 podman[218496]: 2026-02-27 17:17:43.220656319 +0000 UTC m=+0.031146458 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:17:44 compute-0 nova_compute[186840]: 2026-02-27 17:17:44.054 186844 DEBUG nova.network.neutron [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updated VIF entry in instance network info cache for port 3e6806d8-8dee-4392-befe-ef55f59117ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:17:44 compute-0 nova_compute[186840]: 2026-02-27 17:17:44.055 186844 DEBUG nova.network.neutron [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:44 compute-0 nova_compute[186840]: 2026-02-27 17:17:44.081 186844 DEBUG oslo_concurrency.lockutils [req-54944689-b2ec-488f-8a24-674514376229 req-c68a3e9b-a186-49bc-b2da-4edadf37cf24 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:44 compute-0 podman[218496]: 2026-02-27 17:17:44.343911691 +0000 UTC m=+1.154401790 container create 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 27 17:17:44 compute-0 systemd[1]: Started libpod-conmon-4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359.scope.
Feb 27 17:17:44 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:17:44 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1b06f2b36ded53f41876e57001154054c4a9affbbcc39ecf2a2e76c9ceaad093/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:17:44 compute-0 podman[218496]: 2026-02-27 17:17:44.7121909 +0000 UTC m=+1.522680969 container init 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:17:44 compute-0 podman[218496]: 2026-02-27 17:17:44.717115043 +0000 UTC m=+1.527605102 container start 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.43.0)
Feb 27 17:17:44 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [NOTICE]   (218516) : New worker (218518) forked
Feb 27 17:17:44 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [NOTICE]   (218516) : Loading success.
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.117 186844 DEBUG nova.compute.manager [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.117 186844 DEBUG oslo_concurrency.lockutils [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.118 186844 DEBUG oslo_concurrency.lockutils [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.118 186844 DEBUG oslo_concurrency.lockutils [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.119 186844 DEBUG nova.compute.manager [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.119 186844 WARNING nova.compute.manager [req-ab515d2a-aedb-4f68-854d-c3382872cf89 req-fb283c6d-073d-4d86-9c56-4d4672677571 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce for instance with vm_state active and task_state None.
Feb 27 17:17:45 compute-0 nova_compute[186840]: 2026-02-27 17:17:45.517 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:45 compute-0 ovn_controller[96756]: 2026-02-27T17:17:45Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:c2:e1 10.100.0.19
Feb 27 17:17:45 compute-0 ovn_controller[96756]: 2026-02-27T17:17:45Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:c2:e1 10.100.0.19
Feb 27 17:17:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:47.090 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:47.091 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:17:47.092 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:47 compute-0 nova_compute[186840]: 2026-02-27 17:17:47.659 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:50 compute-0 nova_compute[186840]: 2026-02-27 17:17:50.518 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:50 compute-0 podman[218528]: 2026-02-27 17:17:50.652928235 +0000 UTC m=+0.056692896 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Feb 27 17:17:52 compute-0 nova_compute[186840]: 2026-02-27 17:17:52.662 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.520 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:55 compute-0 podman[218548]: 2026-02-27 17:17:55.655917635 +0000 UTC m=+0.070693375 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.704 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.705 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.732 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.844 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.845 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.856 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:17:55 compute-0 nova_compute[186840]: 2026-02-27 17:17:55.856 186844 INFO nova.compute.claims [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.016 186844 DEBUG nova.compute.provider_tree [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.038 186844 DEBUG nova.scheduler.client.report [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.065 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.065 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.143 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.143 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.173 186844 INFO nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.192 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.311 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.313 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.313 186844 INFO nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Creating image(s)
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.314 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.315 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.316 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.340 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.418 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.419 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.420 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.445 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.518 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.520 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:56 compute-0 nova_compute[186840]: 2026-02-27 17:17:56.733 186844 DEBUG nova.policy [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.191 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk 1073741824" returned: 0 in 0.671s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.192 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.772s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.193 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.254 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.255 186844 DEBUG nova.virt.disk.api [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.256 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.331 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.333 186844 DEBUG nova.virt.disk.api [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.334 186844 DEBUG nova.objects.instance [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 1c46d320-cd4f-40ea-ba30-d030e4b745b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.354 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.354 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Ensure instance console log exists: /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.355 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.355 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.355 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.666 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:57 compute-0 nova_compute[186840]: 2026-02-27 17:17:57.729 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Successfully created port: b4dd4d81-f47d-472f-bf94-5869a2eb2f67 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.453 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Successfully updated port: b4dd4d81-f47d-472f-bf94-5869a2eb2f67 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.470 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.470 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.470 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.558 186844 DEBUG nova.compute.manager [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-changed-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.558 186844 DEBUG nova.compute.manager [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Refreshing instance network info cache due to event network-changed-b4dd4d81-f47d-472f-bf94-5869a2eb2f67. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.559 186844 DEBUG oslo_concurrency.lockutils [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:17:58 compute-0 nova_compute[186840]: 2026-02-27 17:17:58.698 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.488 186844 DEBUG nova.network.neutron [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Updating instance_info_cache with network_info: [{"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.537 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.537 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Instance network_info: |[{"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.538 186844 DEBUG oslo_concurrency.lockutils [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.538 186844 DEBUG nova.network.neutron [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Refreshing network info cache for port b4dd4d81-f47d-472f-bf94-5869a2eb2f67 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.543 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Start _get_guest_xml network_info=[{"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.549 186844 WARNING nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.561 186844 DEBUG nova.virt.libvirt.host [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.562 186844 DEBUG nova.virt.libvirt.host [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.570 186844 DEBUG nova.virt.libvirt.host [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.571 186844 DEBUG nova.virt.libvirt.host [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.572 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.572 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.573 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.573 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.574 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.574 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.575 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.575 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.576 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.576 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.577 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.577 186844 DEBUG nova.virt.hardware [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.583 186844 DEBUG nova.virt.libvirt.vif [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-968808385',display_name='tempest-TestNetworkBasicOps-server-968808385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-968808385',id=7,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLWrz8tmOcW2zvi/1U1wEEoJQxVItY9vmw9cOpNFjGqVQ0CR7tDuO9zIdatx4sFDb6uF9mxT7YB/H8L/5RbePm8Dg+nQ5GebZ8Zmxgd+vSy/lIcZmX6+EWMBMN7JWmuEw==',key_name='tempest-TestNetworkBasicOps-155566527',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-qeh0xuld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:17:56Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=1c46d320-cd4f-40ea-ba30-d030e4b745b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.583 186844 DEBUG nova.network.os_vif_util [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.584 186844 DEBUG nova.network.os_vif_util [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.586 186844 DEBUG nova.objects.instance [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 1c46d320-cd4f-40ea-ba30-d030e4b745b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.601 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <uuid>1c46d320-cd4f-40ea-ba30-d030e4b745b3</uuid>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <name>instance-00000007</name>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-968808385</nova:name>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:17:59</nova:creationTime>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         <nova:port uuid="b4dd4d81-f47d-472f-bf94-5869a2eb2f67">
Feb 27 17:17:59 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <system>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="serial">1c46d320-cd4f-40ea-ba30-d030e4b745b3</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="uuid">1c46d320-cd4f-40ea-ba30-d030e4b745b3</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </system>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <os>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </os>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <features>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </features>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.config"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:42:6c:83"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <target dev="tapb4dd4d81-f4"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/console.log" append="off"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <video>
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </video>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:17:59 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:17:59 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:17:59 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:17:59 compute-0 nova_compute[186840]: </domain>
Feb 27 17:17:59 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.602 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Preparing to wait for external event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.603 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.603 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.604 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.605 186844 DEBUG nova.virt.libvirt.vif [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-968808385',display_name='tempest-TestNetworkBasicOps-server-968808385',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-968808385',id=7,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLWrz8tmOcW2zvi/1U1wEEoJQxVItY9vmw9cOpNFjGqVQ0CR7tDuO9zIdatx4sFDb6uF9mxT7YB/H8L/5RbePm8Dg+nQ5GebZ8Zmxgd+vSy/lIcZmX6+EWMBMN7JWmuEw==',key_name='tempest-TestNetworkBasicOps-155566527',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-qeh0xuld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:17:56Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=1c46d320-cd4f-40ea-ba30-d030e4b745b3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.605 186844 DEBUG nova.network.os_vif_util [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.607 186844 DEBUG nova.network.os_vif_util [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.607 186844 DEBUG os_vif [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.608 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.609 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.609 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.613 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.614 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb4dd4d81-f4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.615 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb4dd4d81-f4, col_values=(('external_ids', {'iface-id': 'b4dd4d81-f47d-472f-bf94-5869a2eb2f67', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:6c:83', 'vm-uuid': '1c46d320-cd4f-40ea-ba30-d030e4b745b3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.617 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:59 compute-0 NetworkManager[56537]: <info>  [1772212679.6190] manager: (tapb4dd4d81-f4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.621 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.624 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:17:59 compute-0 nova_compute[186840]: 2026-02-27 17:17:59.627 186844 INFO os_vif [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4')
Feb 27 17:18:00 compute-0 nova_compute[186840]: 2026-02-27 17:18:00.073 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:18:00 compute-0 nova_compute[186840]: 2026-02-27 17:18:00.074 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:18:00 compute-0 nova_compute[186840]: 2026-02-27 17:18:00.075 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:42:6c:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:18:00 compute-0 nova_compute[186840]: 2026-02-27 17:18:00.076 186844 INFO nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Using config drive
Feb 27 17:18:00 compute-0 nova_compute[186840]: 2026-02-27 17:18:00.522 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.294 186844 INFO nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Creating config drive at /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.config
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.300 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp1zis5j71 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.427 186844 DEBUG oslo_concurrency.processutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp1zis5j71" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:18:01 compute-0 kernel: tapb4dd4d81-f4: entered promiscuous mode
Feb 27 17:18:01 compute-0 NetworkManager[56537]: <info>  [1772212681.4819] manager: (tapb4dd4d81-f4): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Feb 27 17:18:01 compute-0 ovn_controller[96756]: 2026-02-27T17:18:01Z|00094|binding|INFO|Claiming lport b4dd4d81-f47d-472f-bf94-5869a2eb2f67 for this chassis.
Feb 27 17:18:01 compute-0 ovn_controller[96756]: 2026-02-27T17:18:01Z|00095|binding|INFO|b4dd4d81-f47d-472f-bf94-5869a2eb2f67: Claiming fa:16:3e:42:6c:83 10.100.0.20
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.537 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:01 compute-0 ovn_controller[96756]: 2026-02-27T17:18:01Z|00096|binding|INFO|Setting lport b4dd4d81-f47d-472f-bf94-5869a2eb2f67 ovn-installed in OVS
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.540 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.543 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:01 compute-0 systemd-udevd[218604]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:18:01 compute-0 NetworkManager[56537]: <info>  [1772212681.5682] device (tapb4dd4d81-f4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:18:01 compute-0 NetworkManager[56537]: <info>  [1772212681.5687] device (tapb4dd4d81-f4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:18:01 compute-0 ovn_controller[96756]: 2026-02-27T17:18:01Z|00097|binding|INFO|Setting lport b4dd4d81-f47d-472f-bf94-5869a2eb2f67 up in Southbound
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.587 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:6c:83 10.100.0.20'], port_security=['fa:16:3e:42:6c:83 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-def28598-31df-42ab-92d1-b43c240c6127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8fcac549-a7b8-49db-83c5-dc41c1c04185', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc893a01-f8c4-4bb0-8d1a-3097917beb15, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=b4dd4d81-f47d-472f-bf94-5869a2eb2f67) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.589 106085 INFO neutron.agent.ovn.metadata.agent [-] Port b4dd4d81-f47d-472f-bf94-5869a2eb2f67 in datapath def28598-31df-42ab-92d1-b43c240c6127 bound to our chassis
Feb 27 17:18:01 compute-0 systemd-machined[156136]: New machine qemu-7-instance-00000007.
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.591 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network def28598-31df-42ab-92d1-b43c240c6127
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.604 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f18f4a05-09d3-44f5-8922-0ae5884b67ea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000007.
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.631 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[0de5d007-a18d-462c-9fa6-bd8fcb603e86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.635 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[6c19a281-0d49-438c-bb10-b411454b655b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.666 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[c4250fa0-5b24-4abe-a8e0-97de69f843a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.684 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[d37018ac-94e2-4388-b516-adaa229744a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdef28598-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:03:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353828, 'reachable_time': 16604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218620, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.702 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab8b84d-5145-478b-aa85-1532d0cca4fc]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapdef28598-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353837, 'tstamp': 353837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218622, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdef28598-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353839, 'tstamp': 353839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218622, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.704 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef28598-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.706 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.708 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdef28598-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.709 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.709 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdef28598-30, col_values=(('external_ids', {'iface-id': '35120014-f89b-47df-b6c5-52840b05e2b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:01 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:01.710 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.773 186844 DEBUG nova.compute.manager [req-c23d6311-db6d-42c8-9296-81e927b06f13 req-11021a5b-c764-4a81-b6d6-ad3be0590e68 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.773 186844 DEBUG oslo_concurrency.lockutils [req-c23d6311-db6d-42c8-9296-81e927b06f13 req-11021a5b-c764-4a81-b6d6-ad3be0590e68 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.774 186844 DEBUG oslo_concurrency.lockutils [req-c23d6311-db6d-42c8-9296-81e927b06f13 req-11021a5b-c764-4a81-b6d6-ad3be0590e68 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.774 186844 DEBUG oslo_concurrency.lockutils [req-c23d6311-db6d-42c8-9296-81e927b06f13 req-11021a5b-c764-4a81-b6d6-ad3be0590e68 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:01 compute-0 nova_compute[186840]: 2026-02-27 17:18:01.775 186844 DEBUG nova.compute.manager [req-c23d6311-db6d-42c8-9296-81e927b06f13 req-11021a5b-c764-4a81-b6d6-ad3be0590e68 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Processing event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.141 186844 DEBUG nova.network.neutron [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Updated VIF entry in instance network info cache for port b4dd4d81-f47d-472f-bf94-5869a2eb2f67. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.142 186844 DEBUG nova.network.neutron [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Updating instance_info_cache with network_info: [{"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.166 186844 DEBUG oslo_concurrency.lockutils [req-4d4386ac-8de8-4d8c-9a0d-6ef98f5b4460 req-1f42115a-e1cb-4ba1-869a-3181e8dd0584 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-1c46d320-cd4f-40ea-ba30-d030e4b745b3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.451 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212682.4508073, 1c46d320-cd4f-40ea-ba30-d030e4b745b3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.451 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] VM Started (Lifecycle Event)
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.454 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.457 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.460 186844 INFO nova.virt.libvirt.driver [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Instance spawned successfully.
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.461 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.480 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.489 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.492 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.493 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.493 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.493 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.494 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.494 186844 DEBUG nova.virt.libvirt.driver [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.508 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.508 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212682.4509127, 1c46d320-cd4f-40ea-ba30-d030e4b745b3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.508 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] VM Paused (Lifecycle Event)
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.532 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.536 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212682.4569967, 1c46d320-cd4f-40ea-ba30-d030e4b745b3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.536 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] VM Resumed (Lifecycle Event)
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.564 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.568 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.579 186844 INFO nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Took 6.27 seconds to spawn the instance on the hypervisor.
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.580 186844 DEBUG nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.590 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.646 186844 INFO nova.compute.manager [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Took 6.84 seconds to build instance.
Feb 27 17:18:02 compute-0 nova_compute[186840]: 2026-02-27 17:18:02.662 186844 DEBUG oslo_concurrency.lockutils [None req-724f610e-e0df-44c5-abde-4e354b78f5de 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.957s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.886 186844 DEBUG nova.compute.manager [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.887 186844 DEBUG oslo_concurrency.lockutils [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.888 186844 DEBUG oslo_concurrency.lockutils [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.888 186844 DEBUG oslo_concurrency.lockutils [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.888 186844 DEBUG nova.compute.manager [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] No waiting events found dispatching network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:03 compute-0 nova_compute[186840]: 2026-02-27 17:18:03.889 186844 WARNING nova.compute.manager [req-fd73c105-cc54-40fa-bc72-d1f6e30003de req-fc9b54cb-9eec-4543-b597-44ac5598a606 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received unexpected event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 for instance with vm_state active and task_state None.
Feb 27 17:18:04 compute-0 nova_compute[186840]: 2026-02-27 17:18:04.617 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.270 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'name': 'tempest-TestNetworkBasicOps-server-968808385', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000007', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0922444e0aaf445884a7c2fa20793b1f', 'user_id': '427d6e526715473ebe8997007bbff5cd', 'hostId': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.274 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'name': 'tempest-TestNetworkBasicOps-server-1831329056', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000006', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0922444e0aaf445884a7c2fa20793b1f', 'user_id': '427d6e526715473ebe8997007bbff5cd', 'hostId': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.305 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.latency volume: 1575890217 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.306 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.latency volume: 519083 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.334 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.latency volume: 595286912 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.335 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.latency volume: 52421290 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ddf19f1a-5fcb-4439-9614-15f2c525de38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1575890217, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.275089', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '473f1026-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'fa308199d8d024691849106d88bdc73c231b3c5dae59cbfc415efe6de0e87b07'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 519083, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.275089', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '473f2e12-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'f31ecccfcebc77162aa3cd23c7b5ecc13e8a6222b0daf9a67f03538d80d944c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 595286912, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.275089', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47437120-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '443014b5742bcaabb99c13bbe8a8dddce2ba8a8a634a126383d9bedd70547d08'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 52421290, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.275089', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47438f70-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '457d3e0c62a7802c9a74d856228b5ab1c9518913dc1f029d47ba45c64fb27413'}]}, 'timestamp': '2026-02-27 17:18:05.336533', '_unique_id': '9814e94cb13448579d9bcb9fc9a35b93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.338 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.343 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.344 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.345 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.bytes volume: 31271424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.346 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f22c2005-084f-4717-a7ec-93b857cd8286', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.343581', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4744c9bc-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '2b08165f50e25c4b028581ffffe9426743aaa3218bf7ac7a9cfef081d63a8298'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.343581', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4744e79e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '2f3ca41da2248bb886786a8939cc190b5f922e651ae8b6f38a0955b14d3da054'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31271424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.343581', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '474508aa-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '827a6a435feffa1f78b5ed677c039c8e2f3eecef9b7b77b3ae1115ebeb9c46e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.343581', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47452114-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '67dee3936df748be3812bc3ad78f03aa05667bc2991149979cf971ebf1f0c4e0'}]}, 'timestamp': '2026-02-27 17:18:05.346673', '_unique_id': 'f09ec17d113b454a96661718effffd9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.347 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.356 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 1c46d320-cd4f-40ea-ba30-d030e4b745b3 / tapb4dd4d81-f4 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.357 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.360 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f087df93-6b03-417d-bc8b-7114adfa61a4 / tapddbe59f7-46 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.361 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f087df93-6b03-417d-bc8b-7114adfa61a4 / tap3e6806d8-8d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.361 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets volume: 149 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.362 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9d33cbd9-16b0-4131-9635-b9893a1dcb1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.353624', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4746e1c0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'd3144b4e7c1f8f0b22c15b3a3b6af1af1c9d52659246f34814660e7b5d1b06db'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 149, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.353624', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '474790e8-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '4735fc2801ca02afd41e1882507dd04d2b8298fb76fb4da18fcc2a6456ff8f1d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.353624', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '4747a9c0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '03f3254c3c50e2c70fa5c7307b95a3c8caa40ccbb4f454aff356b70c60304ac1'}]}, 'timestamp': '2026-02-27 17:18:05.363327', '_unique_id': '5636d1f28fb549c188dbd8006e573247'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.364 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.369 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.370 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.370 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>]
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.371 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.372 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.bytes volume: 27965 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.372 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.bytes volume: 1414 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91f9fcdb-1707-4761-9af0-edfbcf179236', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.371199', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4748ff78-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'b33cba16be465482b8f9ea3353ff4bc98125e692adc74920a4153eb3371abc15'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27965, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.371199', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '47491a44-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '703bfbdff7c6baab1b8df3efdc7a59b8f18f2a161027627cdedd91f53358425d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1414, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.371199', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '47493650-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'c598e949f289919362780e8da62649cbd60f70533aa4b747dfc21043879ca4e1'}]}, 'timestamp': '2026-02-27 17:18:05.373474', '_unique_id': 'ac073ac35e5c49fd88906dbca8754438'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.374 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.377 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.378 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.378 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.379 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13c55412-e857-4c7c-b815-6e43a421ac59', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.378234', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '474a0b8e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'dfc818bfe958e8fccb3078ba17e932a3225f4034ad0f5c7115c8ebcbc5368527'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.378234', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '474a1c5a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '1334f12dd00280394f56f4cc41afddf699b58c9ae2cce24662a438fa8f84ebec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.378234', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '474a2c5e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'f1c94c19b87c1f861a0588339d007f9eaec950026abac79429e72c6022427342'}]}, 'timestamp': '2026-02-27 17:18:05.379676', '_unique_id': 'ed2cbf35aeb74105963e44bb855f2535'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.381 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.382 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.382 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.383 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc8e653c-0ee3-43d2-9655-b512983092b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.382576', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '474aabe8-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'df2c2a8092ef0d6f9f38945a70c804bc595046aa2f43305ff9321b8db87388d6'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.382576', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '474ab73c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '42b767a0222b076ee358735fd135f37c833518f5573ac56f33e10a801c23bbb8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.382576', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '474ac2f4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '6fe3ed04a5282f17cad19a4db3916be0a801c34f6d67114d44c8560684354edb'}]}, 'timestamp': '2026-02-27 17:18:05.383486', '_unique_id': 'bad378768b5c4dcf99c409ecad6e3ce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.384 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.397 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.397 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.409 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.usage volume: 30081024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.410 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9a7b012-d591-4567-84df-54b49fdb5960', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.385211', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '474cf2f4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': 'b3f25933186433c059d91e3f48be5cdc532f8553ac35c67cf59a643e10d1db6c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.385211', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '474d0118-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': 'ae6789ecab44389ae1a35766ac5d2fe230e43d4d15f7e888afa9c1972ebfe65a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30081024, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.385211', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '474ed916-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': 'f47f99d1cd03fb12d767017ab619f25819bbc70e87189904c24947d7621354cd'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.385211', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '474ef02c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': 'a5235e94421f66ad49d58c4a7974dbcd3b45d6c32ef82ac05e839ff62440b40f'}]}, 'timestamp': '2026-02-27 17:18:05.410972', '_unique_id': 'ec4bd63c3f93418eb9e76385da2fdeaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.413 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.414 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.436 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.437 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 1c46d320-cd4f-40ea-ba30-d030e4b745b3: ceilometer.compute.pollsters.NoVolumeException
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.454 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/memory.usage volume: 43.8046875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd3658e68-5834-40b6-af59-dc8130d1aee0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.8046875, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'timestamp': '2026-02-27T17:18:05.415040', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '4755a868-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.999201273, 'message_signature': '793ffe5d6a1f16f8ddb735f9fbaba160f8f1b7c67e46e4827d9dc8a49c2b59dd'}]}, 'timestamp': '2026-02-27 17:18:05.455008', '_unique_id': '417e98b98ea8461eb132db09a40abad9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.456 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.457 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.458 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.458 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.458 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.requests volume: 1151 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.458 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '139bf822-bf17-41ee-9e49-6eec2b8ff01e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.457975', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47562d4c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '9548f648b6d2641d0de847a8341e4d998afacbcb6f6fe09818135d14e6e3b4ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.457975', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47563b48-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'de51f4396db2fd4d2789a6a91ff6c1a5114c2946b5980f61e25cf3c9d4681f0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1151, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.457975', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47564606-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': 'a9a0206098aa6802f4c24c440a70f310b12ac2c0a5a4a9c42b4f0364f5de381a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.457975', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47565010-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': 'bc7e1c39a4ba5b9792b613683f387205323913df1a90913d9b759f4da36d3d94'}]}, 'timestamp': '2026-02-27 17:18:05.459180', '_unique_id': '85d091bcac6f47bab05aa6a92e68fbb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.459 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.460 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.460 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.461 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.bytes volume: 23808 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.461 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.bytes volume: 1494 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7daa2f36-3a3b-45cc-8ef2-68150420c732', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.460916', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4756a0c4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': '32cbd05a0b4c943ed27c64f0e2bf8d98c15a5d928ec5d767780bb8bbc6e0b2a5'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23808, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.460916', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '4756aec0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '182cfd66308f98cf89af2c22902465b59981b00541695ebf7b9eb0683b67d394'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1494, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.460916', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '4756b992-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'a5b3d52c722b6801f106fd2653b6dd09bec726b995d4f7a25a42095e57f72dcd'}]}, 'timestamp': '2026-02-27 17:18:05.461893', '_unique_id': '695eeb89d2c84f3eb92ce8bef56d3908'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.462 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.463 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.463 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.464 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.464 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3aadb19-ffb5-478e-8841-d0f9bc27a0e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.463608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '47570848-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'c5dc62d7cf263f8e0bb9028d7a383273fc8b405bf9a54da43a6a95ed85cf444f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.463608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '47571770-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'd5035cc7cfc074131a19a4d25bbec5174f902b6bee6bb103b90e9eb0f6486865'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.463608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '47572364-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'e7c8c67bc7ce70f5611076493705e523efa9dd4627290ceb49d311793e9844cd'}]}, 'timestamp': '2026-02-27 17:18:05.464624', '_unique_id': '7af215f299bc43868d6df6969d44ddd1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>]
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>]
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.467 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.467 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.467 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.latency volume: 9615611387 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.467 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ef884c8-ffa1-4b94-952c-fb61d0c5e92e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.466995', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47578ca0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '48355839d9aa4353721d53ce31b19a65151d6d8981e1d81758f21311ef7aa991'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.466995', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47579858-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '6171183fa6535a389cd3c72dc265425351208d3725ee53caf5fa48751564e32e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9615611387, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.466995', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4757a334-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '2b9d3e31ec6cafab805c9d33328207cab301d7f34cdccd7279693cafe41ad072'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.466995', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4757ae56-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': 'e10723854d14ee3e58477e17d0c78048ac59b86df786aaf0a018789d60c7cd9a'}]}, 'timestamp': '2026-02-27 17:18:05.468145', '_unique_id': 'd47de6d50cac4d57bcfcac9afe709ba0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.469 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.469 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.470 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.470 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c3b73d2a-0617-4076-808f-70e664c6d0e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.469732', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4757f762-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': '81423cd6c3eee0199e17acb85261c4bed1b772f30c5d0ab9e9380b3335e011f7'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.469732', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '4758028e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'f34a836bb811c61896ba420a8c478ecf01b73f49183eb3cab80dc30dc9e41c2f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.469732', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '47580e5a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '4666e1d9da4a0d50d6bd9a18c77ad28b71d57567d022f48e9a1c1d2fd69fa0e7'}]}, 'timestamp': '2026-02-27 17:18:05.470612', '_unique_id': '7b2de1a428d14cb09cc2c3332a300ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.471 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.472 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.472 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.472 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.472 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.allocation volume: 31072256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cefd73fb-9886-4c3e-989c-12b7d2714f93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.472143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47585824-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': '03b013ed3d9007fa3e0e43b2f4b06449be5325d76563bc6a748c134bbdc14dd6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.472143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4758644a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': '7773175ed7a76d330eed9c77b126bb60a3c8e94f63263fdb06293bf7ba655306'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 31072256, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.472143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '47586e9a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': '9fcfa795793ce485ff30c53d37954249e7f48fef917efac682a956ea56bf3559'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.472143', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '47587868-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': '671ebe7515d6faf45f91be01c336866267fa7d65124c40a6912db320e608799b'}]}, 'timestamp': '2026-02-27 17:18:05.473344', '_unique_id': '9ff5f5de698e4dc6b11fbfe90e5bbeea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.473 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.474 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.474 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.475 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.475 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.requests volume: 314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.475 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '142ebffa-fdf9-46fd-a730-262d23bea1e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.474829', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4758be72-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'ca97ab0c6b0c5cf112536374cdb51cc04a798ec7763dea311b6ea6a5dd91de87'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.474829', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4758cad4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'd3933e2ad13268b05bce305630f843081e90dc16d0f35e0137abc6968a17d8fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 314, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.474829', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4758d60a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '19a795e04c0cb1cf8df992c1eff5662abb956e590d24b190bbfe9f4a29e8765a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.474829', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4758e014-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': 'ec4e1e4c40d0f339d7da6c8f49ce4f39cdcb027052a8027e9237de24a1f1a862'}]}, 'timestamp': '2026-02-27 17:18:05.475994', '_unique_id': 'ee989639c7fe46a9ae298f4810bd9bb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.476 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.477 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.477 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.478 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe497de0-c817-4cdd-b30a-53566638a772', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.477490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4759274a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': '9e6278e3f1249a3e052336497111f675ebe4e67e845ca32780cf5d46e3243e36'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.477490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '475932b2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'd95ed951b1b8b012dc2433c771c496b392bfa7b48d0062aa0da31b4c597cf214'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.477490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '47593d20-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '726084e9aa4e9a245cc71c360c6d7304981b06474babf6f168a12a93a6f52727'}]}, 'timestamp': '2026-02-27 17:18:05.478384', '_unique_id': '274da84046bd442dbbb50c59d2698c5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.479 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.480 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.480 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8588fff8-84e5-43cb-b242-1a6bffd65c94', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.479820', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '4759846a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': '9f8c8b4982d6c3a55e77b14df20662b0e040986be960e1bac5246a1ed4be7fed'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.479820', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '47599324-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'c70c32a50cb8aeb65ca5b2f2cb502711c1d3dda16ae44563ea0cd6452269eb79'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.479820', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '4759a0c6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '118c795835d22e6c8a05d56ff8a8366e0146b98db833b5aadc19461a8d19c8b0'}]}, 'timestamp': '2026-02-27 17:18:05.480917', '_unique_id': '590f9cf3b2414b82b95b18c0b0b5f0d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.481 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.482 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.482 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.482 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-968808385>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1831329056>]
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.482 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.483 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.483 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.483 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.483 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c475d1c8-bb3e-4a8c-a804-ca744bc66465', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.483054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '475a00f2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': 'e64c402e91eab0da5d831f268c711116d2645ef10275a2e62a78df5fa22ff427'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.483054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '475a0cf0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.930103068, 'message_signature': 'c82e88b2e52abc62e58f11fe6ab0dbdeff383de8e2d9c19fd174a0cdd18c2858'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.483054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '475a17cc-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': '8cebf4f01ca7e531ee144a622f5a91aa7ed292ec0b568be0ca631e8a6e7ba227'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.483054', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '475a23ca-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.94298688, 'message_signature': '799f4cee5a25db2dfae1eae0443b8fb2ac8a0b946176c6526591dbddb01f7738'}]}, 'timestamp': '2026-02-27 17:18:05.484292', '_unique_id': '39a34adbe7634168aa57c62f897574a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.484 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.485 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.485 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.486 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets volume: 147 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.486 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/network.incoming.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a8b8024-fcb7-40d2-8a5e-da3e392c742a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000007-1c46d320-cd4f-40ea-ba30-d030e4b745b3-tapb4dd4d81-f4', 'timestamp': '2026-02-27T17:18:05.485928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'tapb4dd4d81-f4', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:42:6c:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb4dd4d81-f4'}, 'message_id': '475a7280-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.898633863, 'message_signature': 'd9a9fa39d38c6caa94c051c50103daf991b8dd5956c2e0f51d137250c8429935'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 147, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tapddbe59f7-46', 'timestamp': '2026-02-27T17:18:05.485928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tapddbe59f7-46', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c8:ca:23', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapddbe59f7-46'}, 'message_id': '475a7f3c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': 'd5860e863054e907ed29fc8d690b6a4f75ec735c0ba3e78c11c80ef949bad4d5'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-00000006-f087df93-6b03-417d-bc8b-7114adfa61a4-tap3e6806d8-8d', 'timestamp': '2026-02-27T17:18:05.485928', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'tap3e6806d8-8d', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 
'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:63:c2:e1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e6806d8-8d'}, 'message_id': '475a8a40-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.903059853, 'message_signature': '9c4c559f9f326613f970a68f334f699d7ee88b61d680f3efe5977160f8a0d6f8'}]}, 'timestamp': '2026-02-27 17:18:05.486891', '_unique_id': '999b4a0a71d04cd6adfe1815208d0e00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.487 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.488 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.488 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.489 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.489 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.bytes volume: 73072640 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.489 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bef38049-e2ff-4cf0-ad5a-3497884298fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3-vda', 'timestamp': '2026-02-27T17:18:05.488757', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '475ade50-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': '6ca11207003ac3bc6177e380383acb302ee9001c9591b234a7a00aab20496b71'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'1c46d320-cd4f-40ea-ba30-d030e4b745b3-sda', 'timestamp': '2026-02-27T17:18:05.488757', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '475ae8e6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.819927409, 'message_signature': 'f904eb299d75095677e36d9d568cd1eebbfe7842f0e41060e7c6ea9b220947e6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73072640, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-vda', 'timestamp': '2026-02-27T17:18:05.488757', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '475af4e4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '13d4ece38354b79f7d731985a1b9d1bef383b9ee9f95325d878b6a258baca99d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4-sda', 'timestamp': '2026-02-27T17:18:05.488757', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '475affac-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.852488341, 'message_signature': '3c1a16cc6e38fb9a536101e1c60e79f5ab931bf8d49f82ed600a7797b5a78da2'}]}, 'timestamp': '2026-02-27 17:18:05.489885', '_unique_id': 'c77b5420c23d4079844e751b76b428df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.490 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.491 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.491 12 DEBUG ceilometer.compute.pollsters [-] 1c46d320-cd4f-40ea-ba30-d030e4b745b3/cpu volume: 2830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.491 12 DEBUG ceilometer.compute.pollsters [-] f087df93-6b03-417d-bc8b-7114adfa61a4/cpu volume: 10160000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '688d14b5-96f0-403a-b478-695b314b8aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2830000000, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'timestamp': '2026-02-27T17:18:05.491376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-968808385', 'name': 'instance-00000007', 'instance_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '475b457a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.981602454, 'message_signature': '16e40d3a6af90657ee0b64e088c1b88a33f488a8d7b377b24807b04b149ca07f'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10160000000, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'f087df93-6b03-417d-bc8b-7114adfa61a4', 'timestamp': '2026-02-27T17:18:05.491376', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1831329056', 'name': 'instance-00000006', 'instance_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '475b55e2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3560.999201273, 'message_signature': 'aa844d905c00eb0ee1f4a87778d8c244239ceec39a1f778ecf971951529a1d55'}]}, 'timestamp': '2026-02-27 17:18:05.492137', '_unique_id': '76a6710cd0b245af9fd97c44470cad89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:18:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:18:05.492 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:18:05 compute-0 nova_compute[186840]: 2026-02-27 17:18:05.523 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:07 compute-0 podman[218630]: 2026-02-27 17:18:07.680132465 +0000 UTC m=+0.075300230 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:18:08 compute-0 nova_compute[186840]: 2026-02-27 17:18:08.720 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:09 compute-0 nova_compute[186840]: 2026-02-27 17:18:09.622 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:09 compute-0 podman[218654]: 2026-02-27 17:18:09.662045334 +0000 UTC m=+0.071460544 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:18:10 compute-0 nova_compute[186840]: 2026-02-27 17:18:10.527 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:10 compute-0 nova_compute[186840]: 2026-02-27 17:18:10.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:12 compute-0 nova_compute[186840]: 2026-02-27 17:18:12.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:12 compute-0 nova_compute[186840]: 2026-02-27 17:18:12.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:18:12 compute-0 nova_compute[186840]: 2026-02-27 17:18:12.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:18:13 compute-0 podman[218674]: 2026-02-27 17:18:13.681302527 +0000 UTC m=+0.085426023 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git)
Feb 27 17:18:13 compute-0 podman[218675]: 2026-02-27 17:18:13.70147574 +0000 UTC m=+0.096488629 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, io.buildah.version=1.43.0)
Feb 27 17:18:13 compute-0 nova_compute[186840]: 2026-02-27 17:18:13.714 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:18:13 compute-0 nova_compute[186840]: 2026-02-27 17:18:13.715 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:18:13 compute-0 nova_compute[186840]: 2026-02-27 17:18:13.715 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 27 17:18:13 compute-0 nova_compute[186840]: 2026-02-27 17:18:13.715 186844 DEBUG nova.objects.instance [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:14 compute-0 nova_compute[186840]: 2026-02-27 17:18:14.623 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:15 compute-0 nova_compute[186840]: 2026-02-27 17:18:15.529 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:16 compute-0 ovn_controller[96756]: 2026-02-27T17:18:16Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:6c:83 10.100.0.20
Feb 27 17:18:16 compute-0 ovn_controller[96756]: 2026-02-27T17:18:16Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:6c:83 10.100.0.20
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.048 186844 DEBUG nova.network.neutron [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.075 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.076 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.077 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.077 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.078 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.102 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.102 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.103 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.103 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.225 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.331 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json" returned: 0 in 0.106s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.332 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.398 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.406 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.481 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.482 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.538 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.778 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.780 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5416MB free_disk=73.13668823242188GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.781 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.781 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.905 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance f087df93-6b03-417d-bc8b-7114adfa61a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.905 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 1c46d320-cd4f-40ea-ba30-d030e4b745b3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.906 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.906 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.931 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.956 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.956 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:18:17 compute-0 nova_compute[186840]: 2026-02-27 17:18:17.985 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:18:18 compute-0 nova_compute[186840]: 2026-02-27 17:18:18.015 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:18:18 compute-0 nova_compute[186840]: 2026-02-27 17:18:18.088 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:18:18 compute-0 nova_compute[186840]: 2026-02-27 17:18:18.113 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:18:18 compute-0 nova_compute[186840]: 2026-02-27 17:18:18.147 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:18:18 compute-0 nova_compute[186840]: 2026-02-27 17:18:18.148 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.367s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:19 compute-0 nova_compute[186840]: 2026-02-27 17:18:19.144 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:19 compute-0 nova_compute[186840]: 2026-02-27 17:18:19.145 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:19 compute-0 nova_compute[186840]: 2026-02-27 17:18:19.146 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:18:19 compute-0 nova_compute[186840]: 2026-02-27 17:18:19.146 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:18:19 compute-0 nova_compute[186840]: 2026-02-27 17:18:19.627 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:20 compute-0 nova_compute[186840]: 2026-02-27 17:18:20.531 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:21 compute-0 podman[218755]: 2026-02-27 17:18:21.674368927 +0000 UTC m=+0.079543676 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 27 17:18:24 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:24.074 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:18:24 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:24.076 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.125 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.234 186844 DEBUG nova.compute.manager [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-changed-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.235 186844 DEBUG nova.compute.manager [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing instance network info cache due to event network-changed-3e6806d8-8dee-4392-befe-ef55f59117ce. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.235 186844 DEBUG oslo_concurrency.lockutils [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.236 186844 DEBUG oslo_concurrency.lockutils [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.236 186844 DEBUG nova.network.neutron [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing network info cache for port 3e6806d8-8dee-4392-befe-ef55f59117ce _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:18:24 compute-0 nova_compute[186840]: 2026-02-27 17:18:24.630 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:25 compute-0 nova_compute[186840]: 2026-02-27 17:18:25.429 186844 DEBUG nova.network.neutron [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updated VIF entry in instance network info cache for port 3e6806d8-8dee-4392-befe-ef55f59117ce. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:18:25 compute-0 nova_compute[186840]: 2026-02-27 17:18:25.430 186844 DEBUG nova.network.neutron [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:25 compute-0 nova_compute[186840]: 2026-02-27 17:18:25.450 186844 DEBUG oslo_concurrency.lockutils [req-f880755d-85cc-444c-89d2-a6c34c78755b req-0d5225eb-607f-4252-843a-b69b06ed2a4d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:18:25 compute-0 nova_compute[186840]: 2026-02-27 17:18:25.535 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:26 compute-0 podman[218775]: 2026-02-27 17:18:26.663603616 +0000 UTC m=+0.066820419 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:18:29 compute-0 nova_compute[186840]: 2026-02-27 17:18:29.633 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.538 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.896 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.897 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.898 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.899 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.899 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.902 186844 INFO nova.compute.manager [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Terminating instance
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.904 186844 DEBUG nova.compute.manager [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:18:30 compute-0 kernel: tapb4dd4d81-f4 (unregistering): left promiscuous mode
Feb 27 17:18:30 compute-0 NetworkManager[56537]: <info>  [1772212710.9277] device (tapb4dd4d81-f4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.931 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:30 compute-0 ovn_controller[96756]: 2026-02-27T17:18:30Z|00098|binding|INFO|Releasing lport b4dd4d81-f47d-472f-bf94-5869a2eb2f67 from this chassis (sb_readonly=0)
Feb 27 17:18:30 compute-0 ovn_controller[96756]: 2026-02-27T17:18:30Z|00099|binding|INFO|Setting lport b4dd4d81-f47d-472f-bf94-5869a2eb2f67 down in Southbound
Feb 27 17:18:30 compute-0 ovn_controller[96756]: 2026-02-27T17:18:30Z|00100|binding|INFO|Removing iface tapb4dd4d81-f4 ovn-installed in OVS
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.941 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.946 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:6c:83 10.100.0.20'], port_security=['fa:16:3e:42:6c:83 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': '1c46d320-cd4f-40ea-ba30-d030e4b745b3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-def28598-31df-42ab-92d1-b43c240c6127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fcac549-a7b8-49db-83c5-dc41c1c04185', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc893a01-f8c4-4bb0-8d1a-3097917beb15, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=b4dd4d81-f47d-472f-bf94-5869a2eb2f67) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.948 106085 INFO neutron.agent.ovn.metadata.agent [-] Port b4dd4d81-f47d-472f-bf94-5869a2eb2f67 in datapath def28598-31df-42ab-92d1-b43c240c6127 unbound from our chassis
Feb 27 17:18:30 compute-0 nova_compute[186840]: 2026-02-27 17:18:30.948 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.949 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network def28598-31df-42ab-92d1-b43c240c6127
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.964 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e473004f-2218-43ec-80b1-beb2ca2614ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:30 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 27 17:18:30 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000007.scope: Consumed 13.338s CPU time.
Feb 27 17:18:30 compute-0 systemd-machined[156136]: Machine qemu-7-instance-00000007 terminated.
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.995 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[f33c7f75-0d5c-4368-b4e9-1d1137c754d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:30 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:30.999 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[2705c2e0-58b0-4700-946d-0feef5f78b3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.021 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[531fc350-c835-46e4-909c-4a55f102813a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.035 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[306978e4-50cc-4369-9c82-3884dcaff15b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdef28598-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:03:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353828, 'reachable_time': 16604, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218812, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.052 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[267fc3bc-42b6-466c-b6ed-2ecd8dd87474]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.17'], ['IFA_LOCAL', '10.100.0.17'], ['IFA_BROADCAST', '10.100.0.31'], ['IFA_LABEL', 'tapdef28598-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353837, 'tstamp': 353837}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218813, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapdef28598-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 353839, 'tstamp': 353839}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218813, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.054 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef28598-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.056 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.061 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.061 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdef28598-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.062 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.062 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdef28598-30, col_values=(('external_ids', {'iface-id': '35120014-f89b-47df-b6c5-52840b05e2b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.063 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:18:31 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:31.078 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.173 186844 INFO nova.virt.libvirt.driver [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Instance destroyed successfully.
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.174 186844 DEBUG nova.objects.instance [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 1c46d320-cd4f-40ea-ba30-d030e4b745b3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.192 186844 DEBUG nova.virt.libvirt.vif [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-968808385',display_name='tempest-TestNetworkBasicOps-server-968808385',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-968808385',id=7,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBBLWrz8tmOcW2zvi/1U1wEEoJQxVItY9vmw9cOpNFjGqVQ0CR7tDuO9zIdatx4sFDb6uF9mxT7YB/H8L/5RbePm8Dg+nQ5GebZ8Zmxgd+vSy/lIcZmX6+EWMBMN7JWmuEw==',key_name='tempest-TestNetworkBasicOps-155566527',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:18:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-qeh0xuld',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:18:02Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=1c46d320-cd4f-40ea-ba30-d030e4b745b3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.192 186844 DEBUG nova.network.os_vif_util [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "address": "fa:16:3e:42:6c:83", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb4dd4d81-f4", "ovs_interfaceid": "b4dd4d81-f47d-472f-bf94-5869a2eb2f67", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.193 186844 DEBUG nova.network.os_vif_util [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.194 186844 DEBUG os_vif [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.197 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.198 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb4dd4d81-f4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.200 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.202 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.206 186844 INFO os_vif [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:6c:83,bridge_name='br-int',has_traffic_filtering=True,id=b4dd4d81-f47d-472f-bf94-5869a2eb2f67,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb4dd4d81-f4')
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.208 186844 INFO nova.virt.libvirt.driver [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Deleting instance files /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3_del
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.209 186844 INFO nova.virt.libvirt.driver [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Deletion of /var/lib/nova/instances/1c46d320-cd4f-40ea-ba30-d030e4b745b3_del complete
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.298 186844 INFO nova.compute.manager [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Took 0.39 seconds to destroy the instance on the hypervisor.
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.299 186844 DEBUG oslo.service.loopingcall [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.300 186844 DEBUG nova.compute.manager [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.300 186844 DEBUG nova.network.neutron [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.508 186844 DEBUG nova.compute.manager [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-unplugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.509 186844 DEBUG oslo_concurrency.lockutils [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.510 186844 DEBUG oslo_concurrency.lockutils [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.510 186844 DEBUG oslo_concurrency.lockutils [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.511 186844 DEBUG nova.compute.manager [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] No waiting events found dispatching network-vif-unplugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:31 compute-0 nova_compute[186840]: 2026-02-27 17:18:31.511 186844 DEBUG nova.compute.manager [req-fec58671-57a7-4aba-ac08-27e9eec33eca req-37717de8-2635-4d8b-84c6-e1bf2517c968 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-unplugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.397 186844 DEBUG nova.network.neutron [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.421 186844 INFO nova.compute.manager [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Took 1.12 seconds to deallocate network for instance.
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.468 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.469 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.490 186844 DEBUG nova.compute.manager [req-175ec3ed-a9d2-41fc-b817-ac0b5334066c req-adfa88d4-8139-4196-bdf8-dce0ca609aa5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-deleted-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.555 186844 DEBUG nova.compute.provider_tree [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.575 186844 DEBUG nova.scheduler.client.report [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.602 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.631 186844 INFO nova.scheduler.client.report [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 1c46d320-cd4f-40ea-ba30-d030e4b745b3
Feb 27 17:18:32 compute-0 nova_compute[186840]: 2026-02-27 17:18:32.726 186844 DEBUG oslo_concurrency.lockutils [None req-b3d9956a-bf10-43ac-93dc-d252f6d865d0 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.828s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.630 186844 DEBUG nova.compute.manager [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.631 186844 DEBUG oslo_concurrency.lockutils [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.631 186844 DEBUG oslo_concurrency.lockutils [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.632 186844 DEBUG oslo_concurrency.lockutils [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "1c46d320-cd4f-40ea-ba30-d030e4b745b3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.633 186844 DEBUG nova.compute.manager [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] No waiting events found dispatching network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:33 compute-0 nova_compute[186840]: 2026-02-27 17:18:33.633 186844 WARNING nova.compute.manager [req-0f2a0a4c-4c8a-47d8-8ce4-d07868634a5b req-6be59998-8d10-4360-b809-1a79cc0c72c1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Received unexpected event network-vif-plugged-b4dd4d81-f47d-472f-bf94-5869a2eb2f67 for instance with vm_state deleted and task_state None.
Feb 27 17:18:34 compute-0 nova_compute[186840]: 2026-02-27 17:18:34.987 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-3e6806d8-8dee-4392-befe-ef55f59117ce" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:34 compute-0 nova_compute[186840]: 2026-02-27 17:18:34.987 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-3e6806d8-8dee-4392-befe-ef55f59117ce" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.010 186844 DEBUG nova.objects.instance [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'flavor' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.048 186844 DEBUG nova.virt.libvirt.vif [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.049 186844 DEBUG nova.network.os_vif_util [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.050 186844 DEBUG nova.network.os_vif_util [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.053 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.056 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.060 186844 DEBUG nova.virt.libvirt.driver [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Attempting to detach device tap3e6806d8-8d from instance f087df93-6b03-417d-bc8b-7114adfa61a4 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.061 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] detach device xml: <interface type="ethernet">
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:63:c2:e1"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <target dev="tap3e6806d8-8d"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </interface>
Feb 27 17:18:35 compute-0 nova_compute[186840]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.068 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.072 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <name>instance-00000006</name>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <uuid>f087df93-6b03-417d-bc8b-7114adfa61a4</uuid>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:17:42</nova:creationTime>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:port uuid="3e6806d8-8dee-4392-befe-ef55f59117ce">
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <system>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='serial'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='uuid'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </system>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <os>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </os>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <features>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </features>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk' index='2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config' index='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:c8:ca:23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='tapddbe59f7-46'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:63:c2:e1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='tap3e6806d8-8d'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='net1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       </target>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </console>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <video>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </video>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c239,c902</label>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c239,c902</imagelabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </domain>
Feb 27 17:18:35 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.073 186844 INFO nova.virt.libvirt.driver [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully detached device tap3e6806d8-8d from instance f087df93-6b03-417d-bc8b-7114adfa61a4 from the persistent domain config.
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.074 186844 DEBUG nova.virt.libvirt.driver [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] (1/8): Attempting to detach device tap3e6806d8-8d with device alias net1 from instance f087df93-6b03-417d-bc8b-7114adfa61a4 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.074 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] detach device xml: <interface type="ethernet">
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <mac address="fa:16:3e:63:c2:e1"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <model type="virtio"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <mtu size="1442"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <target dev="tap3e6806d8-8d"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </interface>
Feb 27 17:18:35 compute-0 nova_compute[186840]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Feb 27 17:18:35 compute-0 kernel: tap3e6806d8-8d (unregistering): left promiscuous mode
Feb 27 17:18:35 compute-0 NetworkManager[56537]: <info>  [1772212715.1859] device (tap3e6806d8-8d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.188 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 ovn_controller[96756]: 2026-02-27T17:18:35Z|00101|binding|INFO|Releasing lport 3e6806d8-8dee-4392-befe-ef55f59117ce from this chassis (sb_readonly=0)
Feb 27 17:18:35 compute-0 ovn_controller[96756]: 2026-02-27T17:18:35Z|00102|binding|INFO|Setting lport 3e6806d8-8dee-4392-befe-ef55f59117ce down in Southbound
Feb 27 17:18:35 compute-0 ovn_controller[96756]: 2026-02-27T17:18:35Z|00103|binding|INFO|Removing iface tap3e6806d8-8d ovn-installed in OVS
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.194 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.196 186844 DEBUG nova.virt.libvirt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Received event <DeviceRemovedEvent: 1772212715.1962874, f087df93-6b03-417d-bc8b-7114adfa61a4 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.199 186844 DEBUG nova.virt.libvirt.driver [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Start waiting for the detach event from libvirt for device tap3e6806d8-8d with device alias net1 for instance f087df93-6b03-417d-bc8b-7114adfa61a4 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.200 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.201 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:c2:e1 10.100.0.19', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28', 'neutron:device_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-def28598-31df-42ab-92d1-b43c240c6127', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc893a01-f8c4-4bb0-8d1a-3097917beb15, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=3e6806d8-8dee-4392-befe-ef55f59117ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.204 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 3e6806d8-8dee-4392-befe-ef55f59117ce in datapath def28598-31df-42ab-92d1-b43c240c6127 unbound from our chassis
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.205 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <name>instance-00000006</name>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <uuid>f087df93-6b03-417d-bc8b-7114adfa61a4</uuid>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:17:42</nova:creationTime>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:port uuid="3e6806d8-8dee-4392-befe-ef55f59117ce">
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.19" ipVersion="4"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <system>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='serial'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='uuid'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </system>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <os>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </os>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <features>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </features>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk' index='2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config' index='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:c8:ca:23'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target dev='tapddbe59f7-46'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       </target>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.207 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network def28598-31df-42ab-92d1-b43c240c6127, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </console>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <video>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </video>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c239,c902</label>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c239,c902</imagelabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </domain>
Feb 27 17:18:35 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.206 186844 INFO nova.virt.libvirt.driver [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully detached device tap3e6806d8-8d from instance f087df93-6b03-417d-bc8b-7114adfa61a4 from the live domain config.
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.207 186844 DEBUG nova.virt.libvirt.vif [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.207 186844 DEBUG nova.network.os_vif_util [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.210 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[82a47f62-0c58-4ab5-94cb-1d32686a56c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.211 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-def28598-31df-42ab-92d1-b43c240c6127 namespace which is not needed anymore
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.211 186844 DEBUG nova.network.os_vif_util [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.212 186844 DEBUG os_vif [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.214 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.215 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6806d8-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.217 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.219 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.223 186844 INFO os_vif [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d')
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.225 186844 DEBUG nova.virt.libvirt.guest [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:18:35</nova:creationTime>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:35 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:35 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:35 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:35 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:35 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:18:35 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [NOTICE]   (218516) : haproxy version is 2.8.14-c23fe91
Feb 27 17:18:35 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [NOTICE]   (218516) : path to executable is /usr/sbin/haproxy
Feb 27 17:18:35 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [WARNING]  (218516) : Exiting Master process...
Feb 27 17:18:35 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [ALERT]    (218516) : Current worker (218518) exited with code 143 (Terminated)
Feb 27 17:18:35 compute-0 neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127[218512]: [WARNING]  (218516) : All workers exited. Exiting... (0)
Feb 27 17:18:35 compute-0 systemd[1]: libpod-4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359.scope: Deactivated successfully.
Feb 27 17:18:35 compute-0 podman[218856]: 2026-02-27 17:18:35.374657941 +0000 UTC m=+0.061368803 container died 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:18:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359-userdata-shm.mount: Deactivated successfully.
Feb 27 17:18:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-1b06f2b36ded53f41876e57001154054c4a9affbbcc39ecf2a2e76c9ceaad093-merged.mount: Deactivated successfully.
Feb 27 17:18:35 compute-0 podman[218856]: 2026-02-27 17:18:35.411706825 +0000 UTC m=+0.098417697 container cleanup 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 27 17:18:35 compute-0 systemd[1]: libpod-conmon-4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359.scope: Deactivated successfully.
Feb 27 17:18:35 compute-0 podman[218886]: 2026-02-27 17:18:35.489120997 +0000 UTC m=+0.052932882 container remove 4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.496 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[01104217-e421-4180-9971-601213f18a01]: (4, ('Fri Feb 27 05:18:35 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127 (4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359)\n4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359\nFri Feb 27 05:18:35 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-def28598-31df-42ab-92d1-b43c240c6127 (4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359)\n4186a4efe62dc2ea466c00c5eb72cfb3d709a70ec69a225308374e22b8007359\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.498 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[213558de-6ed1-4d07-b37a-47579f59669a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.499 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdef28598-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.501 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 kernel: tapdef28598-30: left promiscuous mode
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.509 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.514 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[10f60be7-0405-4098-82ef-1abbc6dddf4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.529 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7e39af5c-dd58-4e1a-90d8-7088dd887012]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.530 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ba1ab20f-331e-4993-8e0e-3b74d67f0abc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.540 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.550 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[2db3762c-4092-4ab0-a2a6-17086906c1cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 353823, 'reachable_time': 23500, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218902, 'error': None, 'target': 'ovnmeta-def28598-31df-42ab-92d1-b43c240c6127', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.553 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-def28598-31df-42ab-92d1-b43c240c6127 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:18:35 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:35.553 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[e0a4fec3-9bc7-4e9c-adc6-888e3e290877]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:35 compute-0 systemd[1]: run-netns-ovnmeta\x2ddef28598\x2d31df\x2d42ab\x2d92d1\x2db43c240c6127.mount: Deactivated successfully.
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.947 186844 DEBUG nova.compute.manager [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-unplugged-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.947 186844 DEBUG oslo_concurrency.lockutils [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.947 186844 DEBUG oslo_concurrency.lockutils [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.948 186844 DEBUG oslo_concurrency.lockutils [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.948 186844 DEBUG nova.compute.manager [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-unplugged-3e6806d8-8dee-4392-befe-ef55f59117ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:35 compute-0 nova_compute[186840]: 2026-02-27 17:18:35.948 186844 WARNING nova.compute.manager [req-6620a164-7a58-45c3-b37a-2e4b80586496 req-119883ff-c5cf-4d64-8c2d-d22fd6433ffa 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-unplugged-3e6806d8-8dee-4392-befe-ef55f59117ce for instance with vm_state active and task_state None.
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.194 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.195 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.195 186844 DEBUG nova.network.neutron [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.257 186844 DEBUG nova.compute.manager [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-deleted-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.258 186844 INFO nova.compute.manager [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Neutron deleted interface 3e6806d8-8dee-4392-befe-ef55f59117ce; detaching it from the instance and deleting it from the info cache
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.258 186844 DEBUG nova.network.neutron [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.284 186844 DEBUG nova.objects.instance [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lazy-loading 'system_metadata' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.312 186844 DEBUG nova.objects.instance [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lazy-loading 'flavor' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.334 186844 DEBUG nova.virt.libvirt.vif [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.335 186844 DEBUG nova.network.os_vif_util [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.335 186844 DEBUG nova.network.os_vif_util [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.339 186844 DEBUG nova.virt.libvirt.guest [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.343 186844 DEBUG nova.virt.libvirt.guest [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <name>instance-00000006</name>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <uuid>f087df93-6b03-417d-bc8b-7114adfa61a4</uuid>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:18:35</nova:creationTime>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <system>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='serial'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='uuid'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </system>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <os>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </os>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <features>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </features>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk' index='2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config' index='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:c8:ca:23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='tapddbe59f7-46'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       </target>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </console>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <video>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </video>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c239,c902</label>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c239,c902</imagelabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]: </domain>
Feb 27 17:18:36 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.344 186844 DEBUG nova.virt.libvirt.guest [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.348 186844 DEBUG nova.virt.libvirt.guest [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:63:c2:e1"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap3e6806d8-8d"/></interface>not found in domain: <domain type='kvm' id='6'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <name>instance-00000006</name>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <uuid>f087df93-6b03-417d-bc8b-7114adfa61a4</uuid>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:18:35</nova:creationTime>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <memory unit='KiB'>131072</memory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <currentMemory unit='KiB'>131072</currentMemory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <vcpu placement='static'>1</vcpu>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <resource>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <partition>/machine</partition>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </resource>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <sysinfo type='smbios'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <system>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='manufacturer'>RDO</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='product'>OpenStack Compute</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='version'>27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='serial'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='uuid'>f087df93-6b03-417d-bc8b-7114adfa61a4</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <entry name='family'>Virtual Machine</entry>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </system>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <os>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <boot dev='hd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <smbios mode='sysinfo'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </os>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <features>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <vmcoreinfo state='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </features>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <cpu mode='custom' match='exact' check='full'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <model fallback='forbid'>EPYC-Rome</model>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <vendor>AMD</vendor>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='x2apic'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc-deadline'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='hypervisor'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='tsc_adjust'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='spec-ctrl'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='stibp'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='cmp_legacy'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='overflow-recov'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='succor'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='ibrs'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='amd-ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='virt-ssbd'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='lbrv'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='tsc-scale'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='vmcb-clean'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='flushbyasid'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='pause-filter'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='pfthreshold'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='svme-addr-chk'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='lfence-always-serializing'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='xsaves'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='svm'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='require' name='topoext'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='npt'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <feature policy='disable' name='nrip-save'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <clock offset='utc'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='pit' tickpolicy='delay'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='rtc' tickpolicy='catchup'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <timer name='hpet' present='no'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_poweroff>destroy</on_poweroff>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_reboot>restart</on_reboot>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <on_crash>destroy</on_crash>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <disk type='file' device='disk'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='qemu' type='qcow2' cache='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk' index='2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backingStore type='file' index='3'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <format type='raw'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <source file='/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <backingStore/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       </backingStore>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='vda' bus='virtio'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='virtio-disk0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <disk type='file' device='cdrom'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='qemu' type='raw' cache='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/disk.config' index='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backingStore/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='sda' bus='sata'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <readonly/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='sata0-0-0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='0' model='pcie-root'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pcie.0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='1' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='1' port='0x10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='2' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='2' port='0x11'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='3' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='3' port='0x12'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='4' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='4' port='0x13'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='5' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='5' port='0x14'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='6' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='6' port='0x15'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='7' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='7' port='0x16'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='8' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='8' port='0x17'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.8'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='9' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='9' port='0x18'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.9'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='10' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='10' port='0x19'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='11' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='11' port='0x1a'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.11'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='12' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='12' port='0x1b'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.12'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='13' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='13' port='0x1c'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.13'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='14' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='14' port='0x1d'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.14'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='15' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='15' port='0x1e'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.15'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='16' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='16' port='0x1f'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.16'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='17' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='17' port='0x20'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.17'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='18' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='18' port='0x21'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.18'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='19' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='19' port='0x22'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.19'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='20' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='20' port='0x23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.20'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='21' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='21' port='0x24'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.21'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='22' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='22' port='0x25'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.22'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='23' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='23' port='0x26'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='24' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='24' port='0x27'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.24'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='25' model='pcie-root-port'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-root-port'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target chassis='25' port='0x28'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.25'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model name='pcie-pci-bridge'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='pci.26'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='usb' index='0' model='piix3-uhci'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='usb'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <controller type='sata' index='0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='ide'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </controller>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <interface type='ethernet'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <mac address='fa:16:3e:c8:ca:23'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target dev='tapddbe59f7-46'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model type='virtio'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <driver name='vhost' rx_queue_size='512'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <mtu size='1442'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='net0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <serial type='pty'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target type='isa-serial' port='0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:         <model name='isa-serial'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       </target>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <console type='pty' tty='/dev/pts/0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <source path='/dev/pts/0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <log file='/var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4/console.log' append='off'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <target type='serial' port='0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='serial0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </console>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='tablet' bus='usb'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='usb' bus='0' port='1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='mouse' bus='ps2'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input1'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <input type='keyboard' bus='ps2'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='input2'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </input>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <listen type='address' address='::0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </graphics>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <audio id='1' type='none'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <video>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <model type='virtio' heads='1' primary='yes'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='video0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </video>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <watchdog model='itco' action='reset'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='watchdog0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </watchdog>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <memballoon model='virtio'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <stats period='10'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='balloon0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <rng model='virtio'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <backend model='random'>/dev/urandom</backend>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <alias name='rng0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <label>system_u:system_r:svirt_t:s0:c239,c902</label>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c239,c902</imagelabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <label>+107:+107</label>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <imagelabel>+107:+107</imagelabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </seclabel>
Feb 27 17:18:36 compute-0 nova_compute[186840]: </domain>
Feb 27 17:18:36 compute-0 nova_compute[186840]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.350 186844 WARNING nova.virt.libvirt.driver [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Detaching interface fa:16:3e:63:c2:e1 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap3e6806d8-8d' not found.
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.350 186844 DEBUG nova.virt.libvirt.vif [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.351 186844 DEBUG nova.network.os_vif_util [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converting VIF {"id": "3e6806d8-8dee-4392-befe-ef55f59117ce", "address": "fa:16:3e:63:c2:e1", "network": {"id": "def28598-31df-42ab-92d1-b43c240c6127", "bridge": "br-int", "label": "tempest-network-smoke--849704008", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.19", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e6806d8-8d", "ovs_interfaceid": "3e6806d8-8dee-4392-befe-ef55f59117ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.351 186844 DEBUG nova.network.os_vif_util [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.352 186844 DEBUG os_vif [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.353 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.353 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e6806d8-8d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.353 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.355 186844 INFO os_vif [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:63:c2:e1,bridge_name='br-int',has_traffic_filtering=True,id=3e6806d8-8dee-4392-befe-ef55f59117ce,network=Network(def28598-31df-42ab-92d1-b43c240c6127),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e6806d8-8d')
Feb 27 17:18:36 compute-0 nova_compute[186840]: 2026-02-27 17:18:36.356 186844 DEBUG nova.virt.libvirt.guest [req-1614b847-ca72-4b62-a293-26e02a8684e5 req-90cffcb1-6ee4-4264-99e6-d3c728d7ca7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:name>tempest-TestNetworkBasicOps-server-1831329056</nova:name>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:creationTime>2026-02-27 17:18:36</nova:creationTime>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:flavor name="m1.nano">
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:memory>128</nova:memory>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:disk>1</nova:disk>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:swap>0</nova:swap>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:vcpus>1</nova:vcpus>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:flavor>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:owner>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   <nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     <nova:port uuid="ddbe59f7-465a-458f-a721-e3d5d380e6cc">
Feb 27 17:18:36 compute-0 nova_compute[186840]:       <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Feb 27 17:18:36 compute-0 nova_compute[186840]:     </nova:port>
Feb 27 17:18:36 compute-0 nova_compute[186840]:   </nova:ports>
Feb 27 17:18:36 compute-0 nova_compute[186840]: </nova:instance>
Feb 27 17:18:36 compute-0 nova_compute[186840]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Feb 27 17:18:37 compute-0 ovn_controller[96756]: 2026-02-27T17:18:37Z|00104|binding|INFO|Releasing lport 7974a05e-8c1c-4bb6-b06c-d51011d44f74 from this chassis (sb_readonly=0)
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.007 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.070 186844 DEBUG nova.compute.manager [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.070 186844 DEBUG oslo_concurrency.lockutils [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.070 186844 DEBUG oslo_concurrency.lockutils [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.071 186844 DEBUG oslo_concurrency.lockutils [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.071 186844 DEBUG nova.compute.manager [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.072 186844 WARNING nova.compute.manager [req-b55c1659-0d45-4cf0-8f8f-353d2b402e0d req-2a02e844-48d9-4332-881f-1ccf0e4a52c2 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-plugged-3e6806d8-8dee-4392-befe-ef55f59117ce for instance with vm_state active and task_state None.
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.430 186844 INFO nova.network.neutron [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Port 3e6806d8-8dee-4392-befe-ef55f59117ce from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.430 186844 DEBUG nova.network.neutron [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [{"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.446 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.467 186844 DEBUG oslo_concurrency.lockutils [None req-2cf12224-f264-4e3e-b8b1-85f1760dcbfc 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "interface-f087df93-6b03-417d-bc8b-7114adfa61a4-3e6806d8-8dee-4392-befe-ef55f59117ce" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.479s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.635 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.636 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.636 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.637 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.637 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.640 186844 INFO nova.compute.manager [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Terminating instance
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.641 186844 DEBUG nova.compute.manager [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:18:38 compute-0 kernel: tapddbe59f7-46 (unregistering): left promiscuous mode
Feb 27 17:18:38 compute-0 NetworkManager[56537]: <info>  [1772212718.6670] device (tapddbe59f7-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:18:38 compute-0 ovn_controller[96756]: 2026-02-27T17:18:38Z|00105|binding|INFO|Releasing lport ddbe59f7-465a-458f-a721-e3d5d380e6cc from this chassis (sb_readonly=0)
Feb 27 17:18:38 compute-0 ovn_controller[96756]: 2026-02-27T17:18:38Z|00106|binding|INFO|Setting lport ddbe59f7-465a-458f-a721-e3d5d380e6cc down in Southbound
Feb 27 17:18:38 compute-0 ovn_controller[96756]: 2026-02-27T17:18:38Z|00107|binding|INFO|Removing iface tapddbe59f7-46 ovn-installed in OVS
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.668 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:38.684 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c8:ca:23 10.100.0.6'], port_security=['fa:16:3e:c8:ca:23 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'f087df93-6b03-417d-bc8b-7114adfa61a4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4694fc97-3ead-4f0e-a0fa-02f879d98eb1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=53a09580-e670-430e-8e67-1c6e90b35016, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=ddbe59f7-465a-458f-a721-e3d5d380e6cc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:18:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:38.686 106085 INFO neutron.agent.ovn.metadata.agent [-] Port ddbe59f7-465a-458f-a721-e3d5d380e6cc in datapath 8e14168a-35d3-4dd3-9225-5c6c14ef7d52 unbound from our chassis
Feb 27 17:18:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:38.687 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8e14168a-35d3-4dd3-9225-5c6c14ef7d52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:18:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:38.688 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e6e5e76d-5fa2-4b01-9923-03e1ddc929cc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:38.690 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52 namespace which is not needed anymore
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.693 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 podman[218903]: 2026-02-27 17:18:38.696560569 +0000 UTC m=+0.091468193 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:18:38 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 27 17:18:38 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000006.scope: Consumed 15.074s CPU time.
Feb 27 17:18:38 compute-0 systemd-machined[156136]: Machine qemu-6-instance-00000006 terminated.
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.864 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.868 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.895 186844 INFO nova.virt.libvirt.driver [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance destroyed successfully.
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.896 186844 DEBUG nova.objects.instance [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid f087df93-6b03-417d-bc8b-7114adfa61a4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.908 186844 DEBUG nova.virt.libvirt.vif [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:17:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1831329056',display_name='tempest-TestNetworkBasicOps-server-1831329056',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1831329056',id=6,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMIaVKLnwz32T5jDpyo7WEJrlvskZslAI5/7NGxUJivyVhVGtFkAYnU35V97Oz4Wgiv2ux6ErJ2dANrk8vgbnGnUPzSF4PSRLYk7XU+cGTBsuuaM3cDuxAsl3jR6sor7og==',key_name='tempest-TestNetworkBasicOps-1101354959',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:17:14Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-6lfcp046',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:17:14Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f087df93-6b03-417d-bc8b-7114adfa61a4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.909 186844 DEBUG nova.network.os_vif_util [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "address": "fa:16:3e:c8:ca:23", "network": {"id": "8e14168a-35d3-4dd3-9225-5c6c14ef7d52", "bridge": "br-int", "label": "tempest-network-smoke--1138499201", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.229", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddbe59f7-46", "ovs_interfaceid": "ddbe59f7-465a-458f-a721-e3d5d380e6cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.909 186844 DEBUG nova.network.os_vif_util [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.910 186844 DEBUG os_vif [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.911 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.911 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddbe59f7-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.913 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.915 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.918 186844 INFO os_vif [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c8:ca:23,bridge_name='br-int',has_traffic_filtering=True,id=ddbe59f7-465a-458f-a721-e3d5d380e6cc,network=Network(8e14168a-35d3-4dd3-9225-5c6c14ef7d52),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddbe59f7-46')
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.918 186844 INFO nova.virt.libvirt.driver [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Deleting instance files /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4_del
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.919 186844 INFO nova.virt.libvirt.driver [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Deletion of /var/lib/nova/instances/f087df93-6b03-417d-bc8b-7114adfa61a4_del complete
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.975 186844 INFO nova.compute.manager [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.976 186844 DEBUG oslo.service.loopingcall [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.977 186844 DEBUG nova.compute.manager [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:18:38 compute-0 nova_compute[186840]: 2026-02-27 17:18:38.977 186844 DEBUG nova.network.neutron [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [NOTICE]   (218257) : haproxy version is 2.8.14-c23fe91
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [NOTICE]   (218257) : path to executable is /usr/sbin/haproxy
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [WARNING]  (218257) : Exiting Master process...
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [WARNING]  (218257) : Exiting Master process...
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [ALERT]    (218257) : Current worker (218259) exited with code 143 (Terminated)
Feb 27 17:18:39 compute-0 neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52[218251]: [WARNING]  (218257) : All workers exited. Exiting... (0)
Feb 27 17:18:39 compute-0 systemd[1]: libpod-4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99.scope: Deactivated successfully.
Feb 27 17:18:39 compute-0 podman[218952]: 2026-02-27 17:18:39.114545251 +0000 UTC m=+0.306966732 container died 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 27 17:18:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99-userdata-shm.mount: Deactivated successfully.
Feb 27 17:18:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-62c9773153dba8e633a74dd8943e14111a074a4c0019933a5ed1cc7c229259e6-merged.mount: Deactivated successfully.
Feb 27 17:18:39 compute-0 podman[218952]: 2026-02-27 17:18:39.634134237 +0000 UTC m=+0.826555718 container cleanup 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 27 17:18:39 compute-0 systemd[1]: libpod-conmon-4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99.scope: Deactivated successfully.
Feb 27 17:18:39 compute-0 nova_compute[186840]: 2026-02-27 17:18:39.836 186844 DEBUG nova.network.neutron [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:39 compute-0 nova_compute[186840]: 2026-02-27 17:18:39.874 186844 INFO nova.compute.manager [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Took 0.90 seconds to deallocate network for instance.
Feb 27 17:18:39 compute-0 nova_compute[186840]: 2026-02-27 17:18:39.944 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:39 compute-0 nova_compute[186840]: 2026-02-27 17:18:39.945 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.014 186844 DEBUG nova.compute.provider_tree [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.039 186844 DEBUG nova.scheduler.client.report [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.083 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.139 186844 INFO nova.scheduler.client.report [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance f087df93-6b03-417d-bc8b-7114adfa61a4
Feb 27 17:18:40 compute-0 podman[218999]: 2026-02-27 17:18:40.197444605 +0000 UTC m=+0.538381297 container remove 4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.202 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e000f910-45c7-43fd-abc0-cc5294b88eaa]: (4, ('Fri Feb 27 05:18:38 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52 (4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99)\n4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99\nFri Feb 27 05:18:39 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52 (4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99)\n4ca80535eb1ce4fb6469606b25b82a4960fd47b94651bd086e28dbc9777c2a99\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.205 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[28974d2d-2752-4474-8780-9523156d0776]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.206 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8e14168a-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.209 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:40 compute-0 kernel: tap8e14168a-30: left promiscuous mode
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.214 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[01391055-c07b-42c6-9e08-4a768ce4b42d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.217 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.224 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[976e5099-ed0b-4f5c-befc-9aea351d1fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.226 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[248cb6dd-cbe1-4a14-88f0-672dde70e164]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.231 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.232 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing instance network info cache due to event network-changed-ddbe59f7-465a-458f-a721-e3d5d380e6cc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.233 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.234 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.234 186844 DEBUG nova.network.neutron [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Refreshing network info cache for port ddbe59f7-465a-458f-a721-e3d5d380e6cc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.242 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[113605cc-8926-4f99-a5a1-5d1c42b08401]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 350882, 'reachable_time': 40614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219017, 'error': None, 'target': 'ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.245 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8e14168a-35d3-4dd3-9225-5c6c14ef7d52 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:18:40 compute-0 systemd[1]: run-netns-ovnmeta\x2d8e14168a\x2d35d3\x2d4dd3\x2d9225\x2d5c6c14ef7d52.mount: Deactivated successfully.
Feb 27 17:18:40 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:40.245 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[34da79ff-2edd-4e6c-8608-6cf4f8f38631]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.301 186844 DEBUG oslo_concurrency.lockutils [None req-1a70f774-c335-43ea-9b7c-1db4995c545b 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:40 compute-0 podman[219014]: 2026-02-27 17:18:40.348374662 +0000 UTC m=+0.108078499 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.406 186844 DEBUG nova.network.neutron [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.543 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:40 compute-0 nova_compute[186840]: 2026-02-27 17:18:40.996 186844 DEBUG nova.network.neutron [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.010 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f087df93-6b03-417d-bc8b-7114adfa61a4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.010 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-unplugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.011 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.011 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.011 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.012 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-unplugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.012 186844 WARNING nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-unplugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc for instance with vm_state deleted and task_state None.
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.012 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.013 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.013 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.013 186844 DEBUG oslo_concurrency.lockutils [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f087df93-6b03-417d-bc8b-7114adfa61a4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.013 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] No waiting events found dispatching network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.014 186844 WARNING nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received unexpected event network-vif-plugged-ddbe59f7-465a-458f-a721-e3d5d380e6cc for instance with vm_state deleted and task_state None.
Feb 27 17:18:41 compute-0 nova_compute[186840]: 2026-02-27 17:18:41.014 186844 DEBUG nova.compute.manager [req-5d00e0b8-9992-4c44-b3fc-9bcc0bafc360 req-6ba64551-53f2-4a6e-9322-8b9d1e09561f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Received event network-vif-deleted-ddbe59f7-465a-458f-a721-e3d5d380e6cc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:18:43 compute-0 nova_compute[186840]: 2026-02-27 17:18:43.914 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:44 compute-0 podman[219038]: 2026-02-27 17:18:44.685222299 +0000 UTC m=+0.084002197 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vendor=Red Hat, Inc., version=9.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Feb 27 17:18:44 compute-0 podman[219039]: 2026-02-27 17:18:44.754318093 +0000 UTC m=+0.146502407 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 27 17:18:45 compute-0 nova_compute[186840]: 2026-02-27 17:18:45.546 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:46 compute-0 nova_compute[186840]: 2026-02-27 17:18:46.172 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212711.1699305, 1c46d320-cd4f-40ea-ba30-d030e4b745b3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:18:46 compute-0 nova_compute[186840]: 2026-02-27 17:18:46.173 186844 INFO nova.compute.manager [-] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] VM Stopped (Lifecycle Event)
Feb 27 17:18:46 compute-0 nova_compute[186840]: 2026-02-27 17:18:46.204 186844 DEBUG nova.compute.manager [None req-5fdbdbbb-ff77-4d17-90f9-fa560cf67787 - - - - - -] [instance: 1c46d320-cd4f-40ea-ba30-d030e4b745b3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:47.093 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:18:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:47.093 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:18:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:18:47.093 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:18:48 compute-0 nova_compute[186840]: 2026-02-27 17:18:48.917 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:50 compute-0 nova_compute[186840]: 2026-02-27 17:18:50.548 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:51 compute-0 nova_compute[186840]: 2026-02-27 17:18:51.449 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:51 compute-0 nova_compute[186840]: 2026-02-27 17:18:51.460 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:52 compute-0 podman[219085]: 2026-02-27 17:18:52.66075911 +0000 UTC m=+0.066998883 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 27 17:18:53 compute-0 sshd[131044]: error: beginning MaxStartups throttling
Feb 27 17:18:53 compute-0 sshd[131044]: drop connection #11 from [176.65.139.12]:8844 on [38.129.56.53]:22 past MaxStartups
Feb 27 17:18:53 compute-0 nova_compute[186840]: 2026-02-27 17:18:53.893 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212718.8924754, f087df93-6b03-417d-bc8b-7114adfa61a4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:18:53 compute-0 nova_compute[186840]: 2026-02-27 17:18:53.894 186844 INFO nova.compute.manager [-] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] VM Stopped (Lifecycle Event)
Feb 27 17:18:53 compute-0 nova_compute[186840]: 2026-02-27 17:18:53.916 186844 DEBUG nova.compute.manager [None req-845cd4b5-2a60-433e-9d76-1d2d22c1c7e3 - - - - - -] [instance: f087df93-6b03-417d-bc8b-7114adfa61a4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:18:53 compute-0 nova_compute[186840]: 2026-02-27 17:18:53.920 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:55 compute-0 nova_compute[186840]: 2026-02-27 17:18:55.591 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:18:57 compute-0 podman[219122]: 2026-02-27 17:18:57.673334491 +0000 UTC m=+0.077557287 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:18:58 compute-0 nova_compute[186840]: 2026-02-27 17:18:58.923 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:00 compute-0 nova_compute[186840]: 2026-02-27 17:19:00.634 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:02 compute-0 sshd-session[219114]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219111]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219114]: Connection reset by 176.65.139.12 port 8882
Feb 27 17:19:02 compute-0 sshd-session[219111]: Connection reset by 176.65.139.12 port 8864
Feb 27 17:19:02 compute-0 sshd-session[219119]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219109]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219109]: Connection reset by 176.65.139.12 port 8782
Feb 27 17:19:02 compute-0 sshd-session[219110]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219107]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219120]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219110]: Connection reset by 176.65.139.12 port 8756
Feb 27 17:19:02 compute-0 sshd-session[219120]: Connection reset by 176.65.139.12 port 8850
Feb 27 17:19:02 compute-0 sshd-session[219107]: Connection reset by 176.65.139.12 port 8880
Feb 27 17:19:02 compute-0 sshd-session[219108]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219108]: Connection reset by 176.65.139.12 port 8810
Feb 27 17:19:02 compute-0 sshd-session[219113]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219113]: Connection reset by 176.65.139.12 port 8768
Feb 27 17:19:02 compute-0 sshd-session[219112]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219117]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219121]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219117]: Connection reset by 176.65.139.12 port 8886
Feb 27 17:19:02 compute-0 sshd-session[219112]: Connection reset by 176.65.139.12 port 8832
Feb 27 17:19:02 compute-0 sshd-session[219121]: Connection reset by 176.65.139.12 port 8904
Feb 27 17:19:02 compute-0 sshd-session[219119]: Connection reset by 176.65.139.12 port 8812
Feb 27 17:19:02 compute-0 sshd-session[219115]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219115]: Connection reset by 176.65.139.12 port 8758
Feb 27 17:19:02 compute-0 sshd-session[219116]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219116]: Connection reset by 176.65.139.12 port 8898
Feb 27 17:19:02 compute-0 sshd-session[219118]: error: kex_exchange_identification: read: Connection reset by peer
Feb 27 17:19:02 compute-0 sshd-session[219118]: Connection reset by 176.65.139.12 port 8896
Feb 27 17:19:03 compute-0 nova_compute[186840]: 2026-02-27 17:19:03.926 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:05 compute-0 nova_compute[186840]: 2026-02-27 17:19:05.637 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.380 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.380 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.396 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.489 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.490 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.500 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.501 186844 INFO nova.compute.claims [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.603 186844 DEBUG nova.compute.provider_tree [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.619 186844 DEBUG nova.scheduler.client.report [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.643 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.644 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.703 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.703 186844 DEBUG nova.network.neutron [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.726 186844 INFO nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.753 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.845 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.847 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.847 186844 INFO nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Creating image(s)
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.848 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.849 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.850 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.874 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.957 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.958 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.959 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:07 compute-0 nova_compute[186840]: 2026-02-27 17:19:07.980 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.043 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.044 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.110 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk 1073741824" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.112 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.113 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.191 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.192 186844 DEBUG nova.virt.disk.api [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.193 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.256 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.257 186844 DEBUG nova.virt.disk.api [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.258 186844 DEBUG nova.objects.instance [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid c7a660b7-7953-4afc-96b7-4df30ec0567a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.274 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.275 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Ensure instance console log exists: /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.276 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.276 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.277 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:08 compute-0 nova_compute[186840]: 2026-02-27 17:19:08.930 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:09 compute-0 podman[219162]: 2026-02-27 17:19:09.657145863 +0000 UTC m=+0.061140565 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:19:09 compute-0 nova_compute[186840]: 2026-02-27 17:19:09.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:09 compute-0 nova_compute[186840]: 2026-02-27 17:19:09.866 186844 DEBUG nova.policy [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:19:10 compute-0 nova_compute[186840]: 2026-02-27 17:19:10.638 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:10 compute-0 podman[219186]: 2026-02-27 17:19:10.66206845 +0000 UTC m=+0.062233553 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 27 17:19:11 compute-0 nova_compute[186840]: 2026-02-27 17:19:11.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.114 186844 DEBUG nova.network.neutron [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Successfully updated port: bde07ed9-3b87-4da1-9140-a5329259cd8a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.130 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.131 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.131 186844 DEBUG nova.network.neutron [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.218 186844 DEBUG nova.compute.manager [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.218 186844 DEBUG nova.compute.manager [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Refreshing instance network info cache due to event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.219 186844 DEBUG oslo_concurrency.lockutils [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:19:12 compute-0 nova_compute[186840]: 2026-02-27 17:19:12.346 186844 DEBUG nova.network.neutron [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.016 186844 DEBUG nova.network.neutron [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updating instance_info_cache with network_info: [{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.041 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.042 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Instance network_info: |[{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.042 186844 DEBUG oslo_concurrency.lockutils [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.042 186844 DEBUG nova.network.neutron [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Refreshing network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.045 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Start _get_guest_xml network_info=[{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.050 186844 WARNING nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.054 186844 DEBUG nova.virt.libvirt.host [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.055 186844 DEBUG nova.virt.libvirt.host [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.061 186844 DEBUG nova.virt.libvirt.host [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.062 186844 DEBUG nova.virt.libvirt.host [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.062 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.062 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.063 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.063 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.063 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.063 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.064 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.064 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.064 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.064 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.065 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.065 186844 DEBUG nova.virt.hardware [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.068 186844 DEBUG nova.virt.libvirt.vif [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:19:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1009731089',display_name='tempest-TestNetworkBasicOps-server-1009731089',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1009731089',id=8,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKzKZ3DZCepYH3wk85vjx4SAt6Ph0pfRU6/g/ko20RNpqXc29rRXWF5FlXf+CZuctx91aLlVYtkXWwkcaApXgI2IzZkS6pyLgMbKhmuW4x65cuQGQyUvGEHBV3VhVOPOpg==',key_name='tempest-TestNetworkBasicOps-1843435151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ic7vee00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:19:07Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=c7a660b7-7953-4afc-96b7-4df30ec0567a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.069 186844 DEBUG nova.network.os_vif_util [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.069 186844 DEBUG nova.network.os_vif_util [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.070 186844 DEBUG nova.objects.instance [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid c7a660b7-7953-4afc-96b7-4df30ec0567a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.092 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <uuid>c7a660b7-7953-4afc-96b7-4df30ec0567a</uuid>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <name>instance-00000008</name>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1009731089</nova:name>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:19:13</nova:creationTime>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         <nova:port uuid="bde07ed9-3b87-4da1-9140-a5329259cd8a">
Feb 27 17:19:13 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <system>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="serial">c7a660b7-7953-4afc-96b7-4df30ec0567a</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="uuid">c7a660b7-7953-4afc-96b7-4df30ec0567a</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </system>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <os>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </os>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <features>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </features>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.config"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:c5:4e:0b"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <target dev="tapbde07ed9-3b"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/console.log" append="off"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <video>
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </video>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:19:13 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:19:13 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:19:13 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:19:13 compute-0 nova_compute[186840]: </domain>
Feb 27 17:19:13 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.093 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Preparing to wait for external event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.093 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.094 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.094 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.094 186844 DEBUG nova.virt.libvirt.vif [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:19:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1009731089',display_name='tempest-TestNetworkBasicOps-server-1009731089',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1009731089',id=8,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKzKZ3DZCepYH3wk85vjx4SAt6Ph0pfRU6/g/ko20RNpqXc29rRXWF5FlXf+CZuctx91aLlVYtkXWwkcaApXgI2IzZkS6pyLgMbKhmuW4x65cuQGQyUvGEHBV3VhVOPOpg==',key_name='tempest-TestNetworkBasicOps-1843435151',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ic7vee00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:19:07Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=c7a660b7-7953-4afc-96b7-4df30ec0567a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.095 186844 DEBUG nova.network.os_vif_util [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.095 186844 DEBUG nova.network.os_vif_util [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.096 186844 DEBUG os_vif [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.096 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.097 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.097 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.101 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.101 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbde07ed9-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.102 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbde07ed9-3b, col_values=(('external_ids', {'iface-id': 'bde07ed9-3b87-4da1-9140-a5329259cd8a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:4e:0b', 'vm-uuid': 'c7a660b7-7953-4afc-96b7-4df30ec0567a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.133 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:13 compute-0 NetworkManager[56537]: <info>  [1772212753.1342] manager: (tapbde07ed9-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.136 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.139 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.141 186844 INFO os_vif [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b')
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.209 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.210 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.210 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:c5:4e:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.211 186844 INFO nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Using config drive
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.695 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.714 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.738 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.739 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.739 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.739 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.809 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.887 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.888 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.941 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.943 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000008, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.config'
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.962 186844 INFO nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Creating config drive at /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.config
Feb 27 17:19:13 compute-0 nova_compute[186840]: 2026-02-27 17:19:13.968 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp6esysbne execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.093 186844 DEBUG oslo_concurrency.processutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp6esysbne" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:14 compute-0 kernel: tapbde07ed9-3b: entered promiscuous mode
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.1649] manager: (tapbde07ed9-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Feb 27 17:19:14 compute-0 systemd-udevd[219229]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:19:14 compute-0 ovn_controller[96756]: 2026-02-27T17:19:14Z|00108|binding|INFO|Claiming lport bde07ed9-3b87-4da1-9140-a5329259cd8a for this chassis.
Feb 27 17:19:14 compute-0 ovn_controller[96756]: 2026-02-27T17:19:14Z|00109|binding|INFO|bde07ed9-3b87-4da1-9140-a5329259cd8a: Claiming fa:16:3e:c5:4e:0b 10.100.0.5
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.227 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.2411] device (tapbde07ed9-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.2418] device (tapbde07ed9-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:19:14 compute-0 systemd-machined[156136]: New machine qemu-8-instance-00000008.
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.250 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:4e:0b 10.100.0.5'], port_security=['fa:16:3e:c5:4e:0b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c7a660b7-7953-4afc-96b7-4df30ec0567a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c49604e-c0c7-417f-9298-de96727088ed, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=bde07ed9-3b87-4da1-9140-a5329259cd8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.252 106085 INFO neutron.agent.ovn.metadata.agent [-] Port bde07ed9-3b87-4da1-9140-a5329259cd8a in datapath 18593015-aa72-4746-ac6a-5ce1ea63dc3d bound to our chassis
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.253 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.266 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3517b5-67ad-403e-b551-6b5eded727c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.267 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap18593015-a1 in ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:19:14 compute-0 ovn_controller[96756]: 2026-02-27T17:19:14Z|00110|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a ovn-installed in OVS
Feb 27 17:19:14 compute-0 ovn_controller[96756]: 2026-02-27T17:19:14Z|00111|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a up in Southbound
Feb 27 17:19:14 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000008.
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.268 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.270 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap18593015-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.270 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7293ddc0-b7c6-40d6-af13-d8659e337743]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.272 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[cca10a65-b5bd-4b9f-8633-de42515bc277]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.275 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.277 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5779MB free_disk=73.1937370300293GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.277 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.278 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.283 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[3e40a72c-30d6-4406-8df3-8fe2af83193f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.297 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[2f8c183e-301a-49a3-b972-351da2a10790]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.319 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[e22b499c-7370-42fa-93e4-b7847d5ce610]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 systemd-udevd[219234]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.325 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ac0859c3-a32f-40dd-adb3-b33786a8850f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.3274] manager: (tap18593015-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/63)
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.352 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[aab2f64e-ad75-4588-89ed-7e3ea2440825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.357 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6822af-af9b-4a91-b860-1a505e2275b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.3747] device (tap18593015-a0): carrier: link connected
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.378 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[2489b906-c3ae-46c3-b77b-ebaf4b2f99b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.387 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance c7a660b7-7953-4afc-96b7-4df30ec0567a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.387 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.387 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.398 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[2a98fa0c-e755-435f-9828-60e097e551fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap18593015-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:d3:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362986, 'reachable_time': 43347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219265, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.414 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[9468163b-3eb4-4c08-81f0-08e41f5b2fe5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:d372'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362986, 'tstamp': 362986}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219266, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.429 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7b04ad89-abc5-49ae-84b6-f3c3da828449]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap18593015-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:d3:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362986, 'reachable_time': 43347, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219267, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.438 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.454 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.471 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[edd6ddc0-5510-4d4a-aa25-fe82bc4fae59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.476 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.477 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.199s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.521 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[5084fb1b-ba58-45ad-b89e-dfa0c982c9ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.523 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18593015-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.523 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.524 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18593015-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.526 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 NetworkManager[56537]: <info>  [1772212754.5274] manager: (tap18593015-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Feb 27 17:19:14 compute-0 kernel: tap18593015-a0: entered promiscuous mode
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.529 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap18593015-a0, col_values=(('external_ids', {'iface-id': 'abed0f28-89ab-4d2d-96b7-a971668f4f6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.530 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.532 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.532 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:19:14 compute-0 ovn_controller[96756]: 2026-02-27T17:19:14Z|00112|binding|INFO|Releasing lport abed0f28-89ab-4d2d-96b7-a971668f4f6e from this chassis (sb_readonly=0)
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.533 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[2256d1ca-f052-47d3-b3d6-6a7bf70eaa53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.534 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:19:14 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:14.536 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'env', 'PROCESS_TAG=haproxy-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/18593015-aa72-4746-ac6a-5ce1ea63dc3d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.542 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.662 186844 DEBUG nova.compute.manager [req-5497f66e-da09-40b6-8b1b-95bc08cc2d18 req-d1367ca4-ea35-4ed2-854b-cb7dc0bd2f13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.663 186844 DEBUG oslo_concurrency.lockutils [req-5497f66e-da09-40b6-8b1b-95bc08cc2d18 req-d1367ca4-ea35-4ed2-854b-cb7dc0bd2f13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.663 186844 DEBUG oslo_concurrency.lockutils [req-5497f66e-da09-40b6-8b1b-95bc08cc2d18 req-d1367ca4-ea35-4ed2-854b-cb7dc0bd2f13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.664 186844 DEBUG oslo_concurrency.lockutils [req-5497f66e-da09-40b6-8b1b-95bc08cc2d18 req-d1367ca4-ea35-4ed2-854b-cb7dc0bd2f13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.664 186844 DEBUG nova.compute.manager [req-5497f66e-da09-40b6-8b1b-95bc08cc2d18 req-d1367ca4-ea35-4ed2-854b-cb7dc0bd2f13 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Processing event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.899 186844 DEBUG nova.network.neutron [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updated VIF entry in instance network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.900 186844 DEBUG nova.network.neutron [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updating instance_info_cache with network_info: [{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.919 186844 DEBUG oslo_concurrency.lockutils [req-d0ca3329-211b-423d-9679-7078718d0b49 req-83bb0d66-1b1c-49dd-a4a6-ba7978a3a54d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.935 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212754.934731, c7a660b7-7953-4afc-96b7-4df30ec0567a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.935 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] VM Started (Lifecycle Event)
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.938 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.944 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:19:14 compute-0 podman[219302]: 2026-02-27 17:19:14.945469454 +0000 UTC m=+0.079842471 container create 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.948 186844 INFO nova.virt.libvirt.driver [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Instance spawned successfully.
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.949 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.958 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.967 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.970 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.972 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.972 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.973 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.973 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 nova_compute[186840]: 2026-02-27 17:19:14.974 186844 DEBUG nova.virt.libvirt.driver [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:14 compute-0 systemd[1]: Started libpod-conmon-209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff.scope.
Feb 27 17:19:15 compute-0 podman[219302]: 2026-02-27 17:19:14.910155136 +0000 UTC m=+0.044528223 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.002 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.003 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212754.9349992, c7a660b7-7953-4afc-96b7-4df30ec0567a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.003 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] VM Paused (Lifecycle Event)
Feb 27 17:19:15 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:19:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f98b1294253e5846b80bdf1cc571d22bf6e26da358b37574c051c5691767f087/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.033 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.037 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212754.9429512, c7a660b7-7953-4afc-96b7-4df30ec0567a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.037 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] VM Resumed (Lifecycle Event)
Feb 27 17:19:15 compute-0 podman[219302]: 2026-02-27 17:19:15.042060889 +0000 UTC m=+0.176433906 container init 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:19:15 compute-0 podman[219302]: 2026-02-27 17:19:15.045809734 +0000 UTC m=+0.180182721 container start 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.046 186844 INFO nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Took 7.20 seconds to spawn the instance on the hypervisor.
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.047 186844 DEBUG nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.057 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.059 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:19:15 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [NOTICE]   (219359) : New worker (219365) forked
Feb 27 17:19:15 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [NOTICE]   (219359) : Loading success.
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.085 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:19:15 compute-0 podman[219322]: 2026-02-27 17:19:15.087194447 +0000 UTC m=+0.099182443 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 27 17:19:15 compute-0 podman[219319]: 2026-02-27 17:19:15.096964375 +0000 UTC m=+0.104920608 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1770267347, vendor=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.106 186844 INFO nova.compute.manager [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Took 7.66 seconds to build instance.
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.123 186844 DEBUG oslo_concurrency.lockutils [None req-239145b6-0a20-4679-ad83-bfb24ee12248 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.461 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.462 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.462 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.480 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.481 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:15 compute-0 nova_compute[186840]: 2026-02-27 17:19:15.641 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.762 186844 DEBUG nova.compute.manager [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.763 186844 DEBUG oslo_concurrency.lockutils [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.763 186844 DEBUG oslo_concurrency.lockutils [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.763 186844 DEBUG oslo_concurrency.lockutils [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.764 186844 DEBUG nova.compute.manager [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] No waiting events found dispatching network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:16 compute-0 nova_compute[186840]: 2026-02-27 17:19:16.764 186844 WARNING nova.compute.manager [req-ff2e82dc-e385-4a02-8224-11b9fa8f521a req-115489bb-9264-493f-9eb1-94385e647543 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received unexpected event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with vm_state active and task_state None.
Feb 27 17:19:18 compute-0 ovn_controller[96756]: 2026-02-27T17:19:18Z|00113|binding|INFO|Releasing lport abed0f28-89ab-4d2d-96b7-a971668f4f6e from this chassis (sb_readonly=0)
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.144 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:18 compute-0 NetworkManager[56537]: <info>  [1772212758.1487] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/65)
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.148 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:18 compute-0 NetworkManager[56537]: <info>  [1772212758.1495] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Feb 27 17:19:18 compute-0 ovn_controller[96756]: 2026-02-27T17:19:18Z|00114|binding|INFO|Releasing lport abed0f28-89ab-4d2d-96b7-a971668f4f6e from this chassis (sb_readonly=0)
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.149 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.916 186844 DEBUG nova.compute.manager [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.917 186844 DEBUG nova.compute.manager [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Refreshing instance network info cache due to event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.918 186844 DEBUG oslo_concurrency.lockutils [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.919 186844 DEBUG oslo_concurrency.lockutils [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:19:18 compute-0 nova_compute[186840]: 2026-02-27 17:19:18.919 186844 DEBUG nova.network.neutron [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Refreshing network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.009 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.010 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.011 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.011 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.012 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.013 186844 INFO nova.compute.manager [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Terminating instance
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.015 186844 DEBUG nova.compute.manager [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:19:19 compute-0 kernel: tapbde07ed9-3b (unregistering): left promiscuous mode
Feb 27 17:19:19 compute-0 NetworkManager[56537]: <info>  [1772212759.0373] device (tapbde07ed9-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:19:19 compute-0 ovn_controller[96756]: 2026-02-27T17:19:19Z|00115|binding|INFO|Releasing lport bde07ed9-3b87-4da1-9140-a5329259cd8a from this chassis (sb_readonly=0)
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.045 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 ovn_controller[96756]: 2026-02-27T17:19:19Z|00116|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a down in Southbound
Feb 27 17:19:19 compute-0 ovn_controller[96756]: 2026-02-27T17:19:19Z|00117|binding|INFO|Removing iface tapbde07ed9-3b ovn-installed in OVS
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.051 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:4e:0b 10.100.0.5'], port_security=['fa:16:3e:c5:4e:0b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'c7a660b7-7953-4afc-96b7-4df30ec0567a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c49604e-c0c7-417f-9298-de96727088ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=bde07ed9-3b87-4da1-9140-a5329259cd8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.053 106085 INFO neutron.agent.ovn.metadata.agent [-] Port bde07ed9-3b87-4da1-9140-a5329259cd8a in datapath 18593015-aa72-4746-ac6a-5ce1ea63dc3d unbound from our chassis
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.054 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 18593015-aa72-4746-ac6a-5ce1ea63dc3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.056 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.055 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3296274f-b3dc-4f38-b3a6-35d3864aabf7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.056 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d namespace which is not needed anymore
Feb 27 17:19:19 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 27 17:19:19 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000008.scope: Consumed 4.765s CPU time.
Feb 27 17:19:19 compute-0 systemd-machined[156136]: Machine qemu-8-instance-00000008 terminated.
Feb 27 17:19:19 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [NOTICE]   (219359) : haproxy version is 2.8.14-c23fe91
Feb 27 17:19:19 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [NOTICE]   (219359) : path to executable is /usr/sbin/haproxy
Feb 27 17:19:19 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [WARNING]  (219359) : Exiting Master process...
Feb 27 17:19:19 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [ALERT]    (219359) : Current worker (219365) exited with code 143 (Terminated)
Feb 27 17:19:19 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219335]: [WARNING]  (219359) : All workers exited. Exiting... (0)
Feb 27 17:19:19 compute-0 systemd[1]: libpod-209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff.scope: Deactivated successfully.
Feb 27 17:19:19 compute-0 podman[219411]: 2026-02-27 17:19:19.182435436 +0000 UTC m=+0.059166655 container died 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.252 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.286 186844 INFO nova.virt.libvirt.driver [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Instance destroyed successfully.
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.287 186844 DEBUG nova.objects.instance [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid c7a660b7-7953-4afc-96b7-4df30ec0567a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.303 186844 DEBUG nova.virt.libvirt.vif [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:19:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1009731089',display_name='tempest-TestNetworkBasicOps-server-1009731089',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1009731089',id=8,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKzKZ3DZCepYH3wk85vjx4SAt6Ph0pfRU6/g/ko20RNpqXc29rRXWF5FlXf+CZuctx91aLlVYtkXWwkcaApXgI2IzZkS6pyLgMbKhmuW4x65cuQGQyUvGEHBV3VhVOPOpg==',key_name='tempest-TestNetworkBasicOps-1843435151',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:19:15Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ic7vee00',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:19:15Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=c7a660b7-7953-4afc-96b7-4df30ec0567a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.303 186844 DEBUG nova.network.os_vif_util [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.304 186844 DEBUG nova.network.os_vif_util [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.305 186844 DEBUG os_vif [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.307 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.308 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbde07ed9-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.309 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.312 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.315 186844 INFO os_vif [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b')
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.316 186844 INFO nova.virt.libvirt.driver [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Deleting instance files /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a_del
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.317 186844 INFO nova.virt.libvirt.driver [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Deletion of /var/lib/nova/instances/c7a660b7-7953-4afc-96b7-4df30ec0567a_del complete
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.373 186844 INFO nova.compute.manager [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.374 186844 DEBUG oslo.service.loopingcall [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.375 186844 DEBUG nova.compute.manager [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.376 186844 DEBUG nova.network.neutron [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:19:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff-userdata-shm.mount: Deactivated successfully.
Feb 27 17:19:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-f98b1294253e5846b80bdf1cc571d22bf6e26da358b37574c051c5691767f087-merged.mount: Deactivated successfully.
Feb 27 17:19:19 compute-0 podman[219411]: 2026-02-27 17:19:19.464537507 +0000 UTC m=+0.341268696 container cleanup 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 17:19:19 compute-0 systemd[1]: libpod-conmon-209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff.scope: Deactivated successfully.
Feb 27 17:19:19 compute-0 podman[219458]: 2026-02-27 17:19:19.563905173 +0000 UTC m=+0.074766671 container remove 209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.569 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[fe7d2aaa-89bf-42da-9e4f-84ac659e60ad]: (4, ('Fri Feb 27 05:19:19 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d (209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff)\n209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff\nFri Feb 27 05:19:19 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d (209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff)\n209ffb68ddc1d2e5ab22d6feeda96044d99ad3aac94ed5f807bcbf53c1f98bff\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.571 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf95e16-1996-4f94-b238-12359f011bc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.572 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18593015-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.575 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 kernel: tap18593015-a0: left promiscuous mode
Feb 27 17:19:19 compute-0 nova_compute[186840]: 2026-02-27 17:19:19.580 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.583 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4b247d0a-cbac-4889-a50f-eb06b62a3e7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.606 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4069d0d4-84fa-49c8-9618-e3299c3a41d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.608 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3feb200c-52cf-41bf-bbb2-8137f84767e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.624 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3960d5fc-199e-4e56-997d-09f5d57dc0ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362980, 'reachable_time': 31601, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219473, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.627 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:19:19 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:19.627 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[a19cbeb6-a812-41cc-962b-8d4690d857cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d18593015\x2daa72\x2d4746\x2dac6a\x2d5ce1ea63dc3d.mount: Deactivated successfully.
Feb 27 17:19:20 compute-0 nova_compute[186840]: 2026-02-27 17:19:20.682 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.170 186844 DEBUG nova.network.neutron [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updated VIF entry in instance network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.171 186844 DEBUG nova.network.neutron [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updating instance_info_cache with network_info: [{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.175 186844 DEBUG nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.176 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.176 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.177 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.177 186844 DEBUG nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] No waiting events found dispatching network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.177 186844 DEBUG nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.178 186844 DEBUG nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.178 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.178 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.179 186844 DEBUG oslo_concurrency.lockutils [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.179 186844 DEBUG nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] No waiting events found dispatching network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.180 186844 WARNING nova.compute.manager [req-6e34fc33-c184-4597-9a04-7c6904812aa3 req-2884bde4-e7b8-42a4-95f5-7f7ab4d59741 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Received unexpected event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with vm_state active and task_state deleting.
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.189 186844 DEBUG oslo_concurrency.lockutils [req-f8711f5f-e942-411a-934c-df0bb5cfb9c4 req-50b1dce0-4611-444f-a939-f40a4ed99d4b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-c7a660b7-7953-4afc-96b7-4df30ec0567a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.372 186844 DEBUG nova.network.neutron [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.387 186844 INFO nova.compute.manager [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Took 2.01 seconds to deallocate network for instance.
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.444 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.444 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.506 186844 DEBUG nova.compute.provider_tree [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.530 186844 DEBUG nova.scheduler.client.report [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.560 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.594 186844 INFO nova.scheduler.client.report [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance c7a660b7-7953-4afc-96b7-4df30ec0567a
Feb 27 17:19:21 compute-0 nova_compute[186840]: 2026-02-27 17:19:21.662 186844 DEBUG oslo_concurrency.lockutils [None req-6dbe1108-9fbf-4a80-b081-c4d0363cc000 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "c7a660b7-7953-4afc-96b7-4df30ec0567a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:23 compute-0 podman[219474]: 2026-02-27 17:19:23.680338792 +0000 UTC m=+0.081214106 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:19:24 compute-0 nova_compute[186840]: 2026-02-27 17:19:24.358 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:25 compute-0 nova_compute[186840]: 2026-02-27 17:19:25.684 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:26 compute-0 nova_compute[186840]: 2026-02-27 17:19:26.791 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:26 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:26.792 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:19:26 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:26.793 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:19:28 compute-0 podman[219495]: 2026-02-27 17:19:28.662509719 +0000 UTC m=+0.062704185 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 27 17:19:28 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:28.795 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:29 compute-0 nova_compute[186840]: 2026-02-27 17:19:29.360 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:30 compute-0 nova_compute[186840]: 2026-02-27 17:19:30.685 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.144 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.145 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.168 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.273 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.274 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.285 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.285 186844 INFO nova.compute.claims [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.289 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212759.2818391, c7a660b7-7953-4afc-96b7-4df30ec0567a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.290 186844 INFO nova.compute.manager [-] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] VM Stopped (Lifecycle Event)
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.345 186844 DEBUG nova.compute.manager [None req-926c78ab-7495-4afa-a986-f2779deae052 - - - - - -] [instance: c7a660b7-7953-4afc-96b7-4df30ec0567a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.366 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.427 186844 DEBUG nova.compute.provider_tree [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.446 186844 DEBUG nova.scheduler.client.report [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.478 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.479 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.566 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.567 186844 DEBUG nova.network.neutron [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.593 186844 INFO nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.615 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.732 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.733 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.733 186844 INFO nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Creating image(s)
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.734 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.734 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.735 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.753 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.831 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.832 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.833 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.856 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.898 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:34 compute-0 nova_compute[186840]: 2026-02-27 17:19:34.899 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.020 186844 DEBUG nova.policy [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.159 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk 1073741824" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.160 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.161 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.241 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.242 186844 DEBUG nova.virt.disk.api [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.242 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.317 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.318 186844 DEBUG nova.virt.disk.api [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.318 186844 DEBUG nova.objects.instance [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid f7b4ac29-bdcd-429a-b61c-01753b15d3da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.335 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.335 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Ensure instance console log exists: /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.335 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.336 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.336 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:35 compute-0 nova_compute[186840]: 2026-02-27 17:19:35.687 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.054 186844 DEBUG nova.network.neutron [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Successfully updated port: bde07ed9-3b87-4da1-9140-a5329259cd8a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.074 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.074 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.075 186844 DEBUG nova.network.neutron [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.159 186844 DEBUG nova.compute.manager [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.160 186844 DEBUG nova.compute.manager [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Refreshing instance network info cache due to event network-changed-bde07ed9-3b87-4da1-9140-a5329259cd8a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.161 186844 DEBUG oslo_concurrency.lockutils [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:19:36 compute-0 nova_compute[186840]: 2026-02-27 17:19:36.249 186844 DEBUG nova.network.neutron [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.825 186844 DEBUG nova.network.neutron [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Updating instance_info_cache with network_info: [{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.860 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.861 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Instance network_info: |[{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.862 186844 DEBUG oslo_concurrency.lockutils [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.862 186844 DEBUG nova.network.neutron [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Refreshing network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.868 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Start _get_guest_xml network_info=[{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.874 186844 WARNING nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.885 186844 DEBUG nova.virt.libvirt.host [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.885 186844 DEBUG nova.virt.libvirt.host [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.892 186844 DEBUG nova.virt.libvirt.host [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.893 186844 DEBUG nova.virt.libvirt.host [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.893 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.894 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.895 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.895 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.896 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.896 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.896 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.897 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.897 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.898 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.898 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.899 186844 DEBUG nova.virt.hardware [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.905 186844 DEBUG nova.virt.libvirt.vif [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:19:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1706788226',display_name='tempest-TestNetworkBasicOps-server-1706788226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1706788226',id=9,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLkLitbV4uK7efSmejh8tHcMQu8yY8NlEPUnPj0a15bPHKkOfUWHmQ7HvadZeaRdQXGWV8mf/kYIMab2EzfSGP5qCilcsn5EfXSrXwFx1fVhUcK2RWGBg3iqzsVoeTsivw==',key_name='tempest-TestNetworkBasicOps-1855572024',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-sry3uzop',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:19:34Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f7b4ac29-bdcd-429a-b61c-01753b15d3da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.906 186844 DEBUG nova.network.os_vif_util [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.907 186844 DEBUG nova.network.os_vif_util [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.908 186844 DEBUG nova.objects.instance [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid f7b4ac29-bdcd-429a-b61c-01753b15d3da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.928 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <uuid>f7b4ac29-bdcd-429a-b61c-01753b15d3da</uuid>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <name>instance-00000009</name>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1706788226</nova:name>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:19:38</nova:creationTime>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         <nova:port uuid="bde07ed9-3b87-4da1-9140-a5329259cd8a">
Feb 27 17:19:38 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <system>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="serial">f7b4ac29-bdcd-429a-b61c-01753b15d3da</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="uuid">f7b4ac29-bdcd-429a-b61c-01753b15d3da</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </system>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <os>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </os>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <features>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </features>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.config"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:c5:4e:0b"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <target dev="tapbde07ed9-3b"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/console.log" append="off"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <video>
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </video>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:19:38 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:19:38 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:19:38 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:19:38 compute-0 nova_compute[186840]: </domain>
Feb 27 17:19:38 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.929 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Preparing to wait for external event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.930 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.930 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.930 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.931 186844 DEBUG nova.virt.libvirt.vif [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:19:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1706788226',display_name='tempest-TestNetworkBasicOps-server-1706788226',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1706788226',id=9,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLkLitbV4uK7efSmejh8tHcMQu8yY8NlEPUnPj0a15bPHKkOfUWHmQ7HvadZeaRdQXGWV8mf/kYIMab2EzfSGP5qCilcsn5EfXSrXwFx1fVhUcK2RWGBg3iqzsVoeTsivw==',key_name='tempest-TestNetworkBasicOps-1855572024',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-sry3uzop',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:19:34Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f7b4ac29-bdcd-429a-b61c-01753b15d3da,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.931 186844 DEBUG nova.network.os_vif_util [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.932 186844 DEBUG nova.network.os_vif_util [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.932 186844 DEBUG os_vif [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.933 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.933 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.933 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.936 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.936 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbde07ed9-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.937 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbde07ed9-3b, col_values=(('external_ids', {'iface-id': 'bde07ed9-3b87-4da1-9140-a5329259cd8a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c5:4e:0b', 'vm-uuid': 'f7b4ac29-bdcd-429a-b61c-01753b15d3da'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.938 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:38 compute-0 NetworkManager[56537]: <info>  [1772212778.9398] manager: (tapbde07ed9-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.942 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.944 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:38 compute-0 nova_compute[186840]: 2026-02-27 17:19:38.946 186844 INFO os_vif [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b')
Feb 27 17:19:39 compute-0 nova_compute[186840]: 2026-02-27 17:19:39.010 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:19:39 compute-0 nova_compute[186840]: 2026-02-27 17:19:39.010 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:19:39 compute-0 nova_compute[186840]: 2026-02-27 17:19:39.010 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:c5:4e:0b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:19:39 compute-0 nova_compute[186840]: 2026-02-27 17:19:39.011 186844 INFO nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Using config drive
Feb 27 17:19:40 compute-0 podman[219537]: 2026-02-27 17:19:40.66829966 +0000 UTC m=+0.069769005 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:19:40 compute-0 nova_compute[186840]: 2026-02-27 17:19:40.719 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:40 compute-0 podman[219561]: 2026-02-27 17:19:40.76153697 +0000 UTC m=+0.064976933 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 27 17:19:40 compute-0 nova_compute[186840]: 2026-02-27 17:19:40.811 186844 INFO nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Creating config drive at /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.config
Feb 27 17:19:40 compute-0 nova_compute[186840]: 2026-02-27 17:19:40.817 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmper6zolnv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:19:40 compute-0 nova_compute[186840]: 2026-02-27 17:19:40.935 186844 DEBUG oslo_concurrency.processutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmper6zolnv" returned: 0 in 0.118s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:19:40 compute-0 kernel: tapbde07ed9-3b: entered promiscuous mode
Feb 27 17:19:40 compute-0 NetworkManager[56537]: <info>  [1772212780.9948] manager: (tapbde07ed9-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Feb 27 17:19:40 compute-0 ovn_controller[96756]: 2026-02-27T17:19:40Z|00118|binding|INFO|Claiming lport bde07ed9-3b87-4da1-9140-a5329259cd8a for this chassis.
Feb 27 17:19:40 compute-0 ovn_controller[96756]: 2026-02-27T17:19:40Z|00119|binding|INFO|bde07ed9-3b87-4da1-9140-a5329259cd8a: Claiming fa:16:3e:c5:4e:0b 10.100.0.5
Feb 27 17:19:40 compute-0 nova_compute[186840]: 2026-02-27 17:19:40.996 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 ovn_controller[96756]: 2026-02-27T17:19:41Z|00120|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a ovn-installed in OVS
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.004 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 ovn_controller[96756]: 2026-02-27T17:19:41Z|00121|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a up in Southbound
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.006 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:4e:0b 10.100.0.5'], port_security=['fa:16:3e:c5:4e:0b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f7b4ac29-bdcd-429a-b61c-01753b15d3da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c49604e-c0c7-417f-9298-de96727088ed, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=bde07ed9-3b87-4da1-9140-a5329259cd8a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.007 106085 INFO neutron.agent.ovn.metadata.agent [-] Port bde07ed9-3b87-4da1-9140-a5329259cd8a in datapath 18593015-aa72-4746-ac6a-5ce1ea63dc3d bound to our chassis
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.008 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.008 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.019 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[5259a904-b342-405d-801f-bc5bd30ea0ed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.019 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap18593015-a1 in ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.021 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap18593015-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.021 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1de1614d-0a71-4a96-9bc8-2263b4fda9e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.022 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ddd3a7-30f4-4256-a46c-02856e400d13]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 systemd-machined[156136]: New machine qemu-9-instance-00000009.
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.031 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[9c61c642-7ccd-4a02-b623-ef610ed27c81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000009.
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.045 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[06bc4d97-ffde-4097-8553-e4314cb1349d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 systemd-udevd[219600]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:19:41 compute-0 NetworkManager[56537]: <info>  [1772212781.0660] device (tapbde07ed9-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:19:41 compute-0 NetworkManager[56537]: <info>  [1772212781.0669] device (tapbde07ed9-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.071 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[8c61e5d7-2ad4-44ad-8982-5e1b4afb867c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.078 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[832309b8-be2e-4b80-b8dd-93129931adc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 NetworkManager[56537]: <info>  [1772212781.0799] manager: (tap18593015-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.109 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a40be273-5c4b-42f3-afcf-0d0b3cb14442]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.113 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[5f42d678-d067-43f2-beba-8e44550358a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 NetworkManager[56537]: <info>  [1772212781.1332] device (tap18593015-a0): carrier: link connected
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.137 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[186f4131-c577-4b94-bc03-87d26d18ff4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.149 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[105f416f-fa73-43ad-9891-c90b87d31445]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap18593015-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:d3:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365662, 'reachable_time': 43685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219630, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.156 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a14b7cc6-2269-4acb-af7d-86f6af847d67]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feef:d372'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 365662, 'tstamp': 365662}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219631, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.165 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ffbe0ded-8d8f-4556-b9b3-ed6be175ff3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap18593015-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ef:d3:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365662, 'reachable_time': 43685, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219632, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.182 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[390cd558-1575-4fdc-9635-883af6593a16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.215 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[775b24d5-2d27-4beb-b0b7-ae4d1a5e293c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.216 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18593015-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.217 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.217 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18593015-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.218 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 kernel: tap18593015-a0: entered promiscuous mode
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.220 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.221 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap18593015-a0, col_values=(('external_ids', {'iface-id': 'abed0f28-89ab-4d2d-96b7-a971668f4f6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.222 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 NetworkManager[56537]: <info>  [1772212781.2225] manager: (tap18593015-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Feb 27 17:19:41 compute-0 ovn_controller[96756]: 2026-02-27T17:19:41Z|00122|binding|INFO|Releasing lport abed0f28-89ab-4d2d-96b7-a971668f4f6e from this chassis (sb_readonly=0)
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.229 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.229 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.230 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[822eceaf-1471-4156-95b4-d01a18f18bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.231 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/18593015-aa72-4746-ac6a-5ce1ea63dc3d.pid.haproxy
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 18593015-aa72-4746-ac6a-5ce1ea63dc3d
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:19:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:41.231 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'env', 'PROCESS_TAG=haproxy-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/18593015-aa72-4746-ac6a-5ce1ea63dc3d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.438 186844 DEBUG nova.compute.manager [req-d933e26d-dd5e-42be-a5dc-4e9685c74fb6 req-6b65bf8f-20a7-4023-9b8c-ec0be829e13d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.438 186844 DEBUG oslo_concurrency.lockutils [req-d933e26d-dd5e-42be-a5dc-4e9685c74fb6 req-6b65bf8f-20a7-4023-9b8c-ec0be829e13d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.438 186844 DEBUG oslo_concurrency.lockutils [req-d933e26d-dd5e-42be-a5dc-4e9685c74fb6 req-6b65bf8f-20a7-4023-9b8c-ec0be829e13d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.439 186844 DEBUG oslo_concurrency.lockutils [req-d933e26d-dd5e-42be-a5dc-4e9685c74fb6 req-6b65bf8f-20a7-4023-9b8c-ec0be829e13d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.439 186844 DEBUG nova.compute.manager [req-d933e26d-dd5e-42be-a5dc-4e9685c74fb6 req-6b65bf8f-20a7-4023-9b8c-ec0be829e13d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Processing event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.549 186844 DEBUG nova.network.neutron [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Updated VIF entry in instance network info cache for port bde07ed9-3b87-4da1-9140-a5329259cd8a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.550 186844 DEBUG nova.network.neutron [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Updating instance_info_cache with network_info: [{"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.569 186844 DEBUG oslo_concurrency.lockutils [req-03aee4b3-d5eb-44b6-a04a-680f45f9d1c6 req-98f973aa-2b40-4d73-9ff8-cc4df145aa7b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-f7b4ac29-bdcd-429a-b61c-01753b15d3da" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:19:41 compute-0 podman[219664]: 2026-02-27 17:19:41.629589018 +0000 UTC m=+0.098483024 container create ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223)
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.652 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212781.651665, f7b4ac29-bdcd-429a-b61c-01753b15d3da => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.652 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] VM Started (Lifecycle Event)
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.656 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:19:41 compute-0 podman[219664]: 2026-02-27 17:19:41.563750574 +0000 UTC m=+0.032644620 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.660 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.663 186844 INFO nova.virt.libvirt.driver [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Instance spawned successfully.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.663 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:19:41 compute-0 systemd[1]: Started libpod-conmon-ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f.scope.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.671 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.684 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.689 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.690 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.690 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.691 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.692 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.693 186844 DEBUG nova.virt.libvirt.driver [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:19:41 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.701 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:19:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53426e7d7bb010e050fadc25da77209e69ba80d6701b519ad5e3b814c4a52c24/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.702 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212781.6518664, f7b4ac29-bdcd-429a-b61c-01753b15d3da => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.703 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] VM Paused (Lifecycle Event)
Feb 27 17:19:41 compute-0 podman[219664]: 2026-02-27 17:19:41.720879069 +0000 UTC m=+0.189773045 container init ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:19:41 compute-0 podman[219664]: 2026-02-27 17:19:41.727065096 +0000 UTC m=+0.195959072 container start ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.739 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.744 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212781.6596985, f7b4ac29-bdcd-429a-b61c-01753b15d3da => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.745 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] VM Resumed (Lifecycle Event)
Feb 27 17:19:41 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [NOTICE]   (219690) : New worker (219692) forked
Feb 27 17:19:41 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [NOTICE]   (219690) : Loading success.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.781 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.787 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.794 186844 INFO nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Took 7.06 seconds to spawn the instance on the hypervisor.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.794 186844 DEBUG nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.836 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.884 186844 INFO nova.compute.manager [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Took 7.65 seconds to build instance.
Feb 27 17:19:41 compute-0 nova_compute[186840]: 2026-02-27 17:19:41.900 186844 DEBUG oslo_concurrency.lockutils [None req-23279f3e-9a01-4d02-b4c3-c33f3a519551 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.403 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.405 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.405 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.405 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.406 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.407 186844 INFO nova.compute.manager [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Terminating instance
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.409 186844 DEBUG nova.compute.manager [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:19:43 compute-0 kernel: tapbde07ed9-3b (unregistering): left promiscuous mode
Feb 27 17:19:43 compute-0 NetworkManager[56537]: <info>  [1772212783.4261] device (tapbde07ed9-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.425 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 ovn_controller[96756]: 2026-02-27T17:19:43Z|00123|binding|INFO|Releasing lport bde07ed9-3b87-4da1-9140-a5329259cd8a from this chassis (sb_readonly=0)
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.430 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 ovn_controller[96756]: 2026-02-27T17:19:43Z|00124|binding|INFO|Setting lport bde07ed9-3b87-4da1-9140-a5329259cd8a down in Southbound
Feb 27 17:19:43 compute-0 ovn_controller[96756]: 2026-02-27T17:19:43Z|00125|binding|INFO|Removing iface tapbde07ed9-3b ovn-installed in OVS
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.432 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.435 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.438 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c5:4e:0b 10.100.0.5'], port_security=['fa:16:3e:c5:4e:0b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f7b4ac29-bdcd-429a-b61c-01753b15d3da', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1464661183', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'c7cddf1d-a2b6-4e60-80f6-1d5a07563c5c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c49604e-c0c7-417f-9298-de96727088ed, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=bde07ed9-3b87-4da1-9140-a5329259cd8a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.439 106085 INFO neutron.agent.ovn.metadata.agent [-] Port bde07ed9-3b87-4da1-9140-a5329259cd8a in datapath 18593015-aa72-4746-ac6a-5ce1ea63dc3d unbound from our chassis
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.440 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 18593015-aa72-4746-ac6a-5ce1ea63dc3d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.441 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0389d51a-3007-436b-89ca-de345bfab0a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.441 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d namespace which is not needed anymore
Feb 27 17:19:43 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Deactivated successfully.
Feb 27 17:19:43 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000009.scope: Consumed 2.328s CPU time.
Feb 27 17:19:43 compute-0 systemd-machined[156136]: Machine qemu-9-instance-00000009 terminated.
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.514 186844 DEBUG nova.compute.manager [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.514 186844 DEBUG oslo_concurrency.lockutils [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.515 186844 DEBUG oslo_concurrency.lockutils [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.515 186844 DEBUG oslo_concurrency.lockutils [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.515 186844 DEBUG nova.compute.manager [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] No waiting events found dispatching network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.516 186844 WARNING nova.compute.manager [req-daf9d7d1-c4d3-421f-abc0-13090a13c40f req-0345f405-42f9-4955-b5be-4d945b7328a0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received unexpected event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with vm_state active and task_state deleting.
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [NOTICE]   (219690) : haproxy version is 2.8.14-c23fe91
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [NOTICE]   (219690) : path to executable is /usr/sbin/haproxy
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [WARNING]  (219690) : Exiting Master process...
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [WARNING]  (219690) : Exiting Master process...
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [ALERT]    (219690) : Current worker (219692) exited with code 143 (Terminated)
Feb 27 17:19:43 compute-0 neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d[219686]: [WARNING]  (219690) : All workers exited. Exiting... (0)
Feb 27 17:19:43 compute-0 systemd[1]: libpod-ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f.scope: Deactivated successfully.
Feb 27 17:19:43 compute-0 podman[219723]: 2026-02-27 17:19:43.576614515 +0000 UTC m=+0.047355315 container died ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:19:43 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f-userdata-shm.mount: Deactivated successfully.
Feb 27 17:19:43 compute-0 systemd[1]: var-lib-containers-storage-overlay-53426e7d7bb010e050fadc25da77209e69ba80d6701b519ad5e3b814c4a52c24-merged.mount: Deactivated successfully.
Feb 27 17:19:43 compute-0 podman[219723]: 2026-02-27 17:19:43.632435884 +0000 UTC m=+0.103176674 container cleanup ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:19:43 compute-0 systemd[1]: libpod-conmon-ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f.scope: Deactivated successfully.
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.660 186844 INFO nova.virt.libvirt.driver [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Instance destroyed successfully.
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.661 186844 DEBUG nova.objects.instance [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid f7b4ac29-bdcd-429a-b61c-01753b15d3da obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.678 186844 DEBUG nova.virt.libvirt.vif [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:19:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1706788226',display_name='tempest-TestNetworkBasicOps-server-1706788226',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1706788226',id=9,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLkLitbV4uK7efSmejh8tHcMQu8yY8NlEPUnPj0a15bPHKkOfUWHmQ7HvadZeaRdQXGWV8mf/kYIMab2EzfSGP5qCilcsn5EfXSrXwFx1fVhUcK2RWGBg3iqzsVoeTsivw==',key_name='tempest-TestNetworkBasicOps-1855572024',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:19:41Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-sry3uzop',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:19:41Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=f7b4ac29-bdcd-429a-b61c-01753b15d3da,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.678 186844 DEBUG nova.network.os_vif_util [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "address": "fa:16:3e:c5:4e:0b", "network": {"id": "18593015-aa72-4746-ac6a-5ce1ea63dc3d", "bridge": "br-int", "label": "tempest-network-smoke--786082759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbde07ed9-3b", "ovs_interfaceid": "bde07ed9-3b87-4da1-9140-a5329259cd8a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.679 186844 DEBUG nova.network.os_vif_util [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.680 186844 DEBUG os_vif [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.681 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.682 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbde07ed9-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.683 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.684 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.686 186844 INFO os_vif [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c5:4e:0b,bridge_name='br-int',has_traffic_filtering=True,id=bde07ed9-3b87-4da1-9140-a5329259cd8a,network=Network(18593015-aa72-4746-ac6a-5ce1ea63dc3d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapbde07ed9-3b')
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.687 186844 INFO nova.virt.libvirt.driver [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Deleting instance files /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da_del
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.687 186844 INFO nova.virt.libvirt.driver [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Deletion of /var/lib/nova/instances/f7b4ac29-bdcd-429a-b61c-01753b15d3da_del complete
Feb 27 17:19:43 compute-0 podman[219763]: 2026-02-27 17:19:43.695367744 +0000 UTC m=+0.042633785 container remove ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.698 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1dc5e6bb-f619-4331-bd84-f0ee809efa10]: (4, ('Fri Feb 27 05:19:43 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d (ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f)\nec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f\nFri Feb 27 05:19:43 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d (ec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f)\nec55c3a607384f989c0c5ee237308f35568404d3a8b3128bd13db3058b93689f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.700 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[159a9b67-49de-47ea-89fb-d3dc69171917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.701 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18593015-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.702 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 kernel: tap18593015-a0: left promiscuous mode
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.710 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.714 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[86c9cc6b-167e-4167-95a0-0781d2cb69e7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.727 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b1235437-e189-4771-80b2-ee15a2a96e46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.728 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[d43a2ed4-ab83-4d3e-8afa-84da012cb82e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.739 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4b2ed1-9c4d-480f-8b7e-a5a00ef489ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 365655, 'reachable_time': 29988, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219785, 'error': None, 'target': 'ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.742 186844 INFO nova.compute.manager [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Took 0.33 seconds to destroy the instance on the hypervisor.
Feb 27 17:19:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d18593015\x2daa72\x2d4746\x2dac6a\x2d5ce1ea63dc3d.mount: Deactivated successfully.
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.741 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-18593015-aa72-4746-ac6a-5ce1ea63dc3d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:19:43 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:43.742 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[f83c20cd-c7d3-4eed-83fb-db2559ce2ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.742 186844 DEBUG oslo.service.loopingcall [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.743 186844 DEBUG nova.compute.manager [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:19:43 compute-0 nova_compute[186840]: 2026-02-27 17:19:43.744 186844 DEBUG nova.network.neutron [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.013 186844 DEBUG nova.network.neutron [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.036 186844 INFO nova.compute.manager [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Took 1.29 seconds to deallocate network for instance.
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.089 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.090 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.149 186844 DEBUG nova.compute.provider_tree [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.168 186844 DEBUG nova.scheduler.client.report [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.199 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.221 186844 INFO nova.scheduler.client.report [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance f7b4ac29-bdcd-429a-b61c-01753b15d3da
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.300 186844 DEBUG oslo_concurrency.lockutils [None req-9164c3e1-86b0-4c2b-8d05-899d807ec0f9 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.896s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.631 186844 DEBUG nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received event network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.631 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.632 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.632 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.632 186844 DEBUG nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] No waiting events found dispatching network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.632 186844 WARNING nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received unexpected event network-vif-unplugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with vm_state deleted and task_state None.
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.632 186844 DEBUG nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.633 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.633 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.633 186844 DEBUG oslo_concurrency.lockutils [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "f7b4ac29-bdcd-429a-b61c-01753b15d3da-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.633 186844 DEBUG nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] No waiting events found dispatching network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.634 186844 WARNING nova.compute.manager [req-a2215d1f-0372-41bb-9072-9b6f1ab3db89 req-b5488033-1b09-44b8-bc2d-cc4261eff42f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Received unexpected event network-vif-plugged-bde07ed9-3b87-4da1-9140-a5329259cd8a for instance with vm_state deleted and task_state None.
Feb 27 17:19:45 compute-0 podman[219786]: 2026-02-27 17:19:45.709759935 +0000 UTC m=+0.103828581 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-type=git)
Feb 27 17:19:45 compute-0 podman[219787]: 2026-02-27 17:19:45.71271981 +0000 UTC m=+0.109118665 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 27 17:19:45 compute-0 nova_compute[186840]: 2026-02-27 17:19:45.721 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:47.093 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:19:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:47.094 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:19:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:19:47.094 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:19:48 compute-0 nova_compute[186840]: 2026-02-27 17:19:48.683 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:50 compute-0 nova_compute[186840]: 2026-02-27 17:19:50.722 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:51 compute-0 nova_compute[186840]: 2026-02-27 17:19:51.518 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:51 compute-0 nova_compute[186840]: 2026-02-27 17:19:51.554 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:53 compute-0 nova_compute[186840]: 2026-02-27 17:19:53.685 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:54 compute-0 podman[219834]: 2026-02-27 17:19:54.678711293 +0000 UTC m=+0.077250394 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:19:55 compute-0 nova_compute[186840]: 2026-02-27 17:19:55.724 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:58 compute-0 nova_compute[186840]: 2026-02-27 17:19:58.659 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212783.6572664, f7b4ac29-bdcd-429a-b61c-01753b15d3da => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:19:58 compute-0 nova_compute[186840]: 2026-02-27 17:19:58.659 186844 INFO nova.compute.manager [-] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] VM Stopped (Lifecycle Event)
Feb 27 17:19:58 compute-0 nova_compute[186840]: 2026-02-27 17:19:58.687 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:19:58 compute-0 nova_compute[186840]: 2026-02-27 17:19:58.690 186844 DEBUG nova.compute.manager [None req-ac7f440f-517c-4ec1-b77f-dc7818bb5fdb - - - - - -] [instance: f7b4ac29-bdcd-429a-b61c-01753b15d3da] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:19:59 compute-0 podman[219854]: 2026-02-27 17:19:59.658423919 +0000 UTC m=+0.062013018 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:20:00 compute-0 nova_compute[186840]: 2026-02-27 17:20:00.726 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:03 compute-0 nova_compute[186840]: 2026-02-27 17:20:03.727 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:20:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:20:05 compute-0 nova_compute[186840]: 2026-02-27 17:20:05.729 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:08 compute-0 nova_compute[186840]: 2026-02-27 17:20:08.728 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:10 compute-0 nova_compute[186840]: 2026-02-27 17:20:10.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:10 compute-0 nova_compute[186840]: 2026-02-27 17:20:10.732 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:11 compute-0 podman[219879]: 2026-02-27 17:20:11.645324628 +0000 UTC m=+0.052897976 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:20:11 compute-0 podman[219880]: 2026-02-27 17:20:11.675095615 +0000 UTC m=+0.074422133 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.383 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.383 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.403 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.495 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.495 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.507 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.507 186844 INFO nova.compute.claims [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.660 186844 DEBUG nova.compute.provider_tree [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.679 186844 DEBUG nova.scheduler.client.report [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.704 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.705 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.765 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.766 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.790 186844 INFO nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.812 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.966 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.968 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.968 186844 INFO nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Creating image(s)
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.969 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.969 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.970 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:12 compute-0 nova_compute[186840]: 2026-02-27 17:20:12.993 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.068 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.070 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.071 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.093 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.155 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.156 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.191 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.192 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.193 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.236 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.238 186844 DEBUG nova.virt.disk.api [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.238 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.311 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.312 186844 DEBUG nova.virt.disk.api [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.313 186844 DEBUG nova.objects.instance [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 45c48dce-50d6-45f9-98c5-85fa5eb52b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.334 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.335 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Ensure instance console log exists: /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.335 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.336 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.336 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.731 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.732 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.732 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.732 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.733 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.878 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.880 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5754MB free_disk=73.19372940063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.880 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.880 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:13 compute-0 nova_compute[186840]: 2026-02-27 17:20:13.977 186844 DEBUG nova.policy [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.001 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 45c48dce-50d6-45f9-98c5-85fa5eb52b61 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.002 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.003 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.067 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.089 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.116 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:20:14 compute-0 nova_compute[186840]: 2026-02-27 17:20:14.116 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.236s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.117 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.118 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.118 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.136 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.136 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.137 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:15 compute-0 nova_compute[186840]: 2026-02-27 17:20:15.733 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:16 compute-0 nova_compute[186840]: 2026-02-27 17:20:16.017 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Successfully created port: 2ac9a504-9a49-47ff-a072-fc80d6b7f534 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:20:16 compute-0 podman[219938]: 2026-02-27 17:20:16.673040534 +0000 UTC m=+0.074695640 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Feb 27 17:20:16 compute-0 podman[219939]: 2026-02-27 17:20:16.702314228 +0000 UTC m=+0.098861254 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260223, 
org.label-schema.license=GPLv2, tcib_managed=true)
Feb 27 17:20:16 compute-0 nova_compute[186840]: 2026-02-27 17:20:16.714 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.671 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Successfully updated port: 2ac9a504-9a49-47ff-a072-fc80d6b7f534 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.690 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.690 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.691 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.697 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.775 186844 DEBUG nova.compute.manager [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.776 186844 DEBUG nova.compute.manager [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing instance network info cache due to event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.776 186844 DEBUG oslo_concurrency.lockutils [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:20:17 compute-0 nova_compute[186840]: 2026-02-27 17:20:17.831 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:20:18 compute-0 nova_compute[186840]: 2026-02-27 17:20:18.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:18 compute-0 nova_compute[186840]: 2026-02-27 17:20:18.733 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.050 186844 DEBUG nova.network.neutron [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updating instance_info_cache with network_info: [{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.072 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.073 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Instance network_info: |[{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.073 186844 DEBUG oslo_concurrency.lockutils [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.074 186844 DEBUG nova.network.neutron [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.077 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Start _get_guest_xml network_info=[{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.082 186844 WARNING nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.090 186844 DEBUG nova.virt.libvirt.host [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.091 186844 DEBUG nova.virt.libvirt.host [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.099 186844 DEBUG nova.virt.libvirt.host [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.100 186844 DEBUG nova.virt.libvirt.host [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.100 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.100 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.101 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.101 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.101 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.102 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.102 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.102 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.103 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.103 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.103 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.104 186844 DEBUG nova.virt.hardware [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.108 186844 DEBUG nova.virt.libvirt.vif [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1948732682',display_name='tempest-TestNetworkBasicOps-server-1948732682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1948732682',id=10,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZpMH2svT+uPhWoBoRCUdLAhaSIdrh9zxtS+bFDvUoYUDLQ5r0uxCjzhdN3tdqzUb9iWTNBRIUPXIzn3UG2fcwjEESz0IsmWqXCuZE0BFT2xT4N0+0NLKCZYOFGG/IThg==',key_name='tempest-TestNetworkBasicOps-1690156444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-wqtbnqx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:20:12Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=45c48dce-50d6-45f9-98c5-85fa5eb52b61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.109 186844 DEBUG nova.network.os_vif_util [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.110 186844 DEBUG nova.network.os_vif_util [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.111 186844 DEBUG nova.objects.instance [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 45c48dce-50d6-45f9-98c5-85fa5eb52b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.129 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <uuid>45c48dce-50d6-45f9-98c5-85fa5eb52b61</uuid>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <name>instance-0000000a</name>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1948732682</nova:name>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:20:19</nova:creationTime>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         <nova:port uuid="2ac9a504-9a49-47ff-a072-fc80d6b7f534">
Feb 27 17:20:19 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <system>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="serial">45c48dce-50d6-45f9-98c5-85fa5eb52b61</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="uuid">45c48dce-50d6-45f9-98c5-85fa5eb52b61</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </system>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <os>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </os>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <features>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </features>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.config"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:51:ca:75"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <target dev="tap2ac9a504-9a"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/console.log" append="off"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <video>
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </video>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:20:19 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:20:19 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:20:19 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:20:19 compute-0 nova_compute[186840]: </domain>
Feb 27 17:20:19 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.130 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Preparing to wait for external event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.131 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.131 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.131 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.132 186844 DEBUG nova.virt.libvirt.vif [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1948732682',display_name='tempest-TestNetworkBasicOps-server-1948732682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1948732682',id=10,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZpMH2svT+uPhWoBoRCUdLAhaSIdrh9zxtS+bFDvUoYUDLQ5r0uxCjzhdN3tdqzUb9iWTNBRIUPXIzn3UG2fcwjEESz0IsmWqXCuZE0BFT2xT4N0+0NLKCZYOFGG/IThg==',key_name='tempest-TestNetworkBasicOps-1690156444',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-wqtbnqx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:20:12Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=45c48dce-50d6-45f9-98c5-85fa5eb52b61,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.133 186844 DEBUG nova.network.os_vif_util [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.134 186844 DEBUG nova.network.os_vif_util [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.134 186844 DEBUG os_vif [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.135 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.136 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.136 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.141 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.141 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2ac9a504-9a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.141 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2ac9a504-9a, col_values=(('external_ids', {'iface-id': '2ac9a504-9a49-47ff-a072-fc80d6b7f534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:51:ca:75', 'vm-uuid': '45c48dce-50d6-45f9-98c5-85fa5eb52b61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.143 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:19 compute-0 NetworkManager[56537]: <info>  [1772212819.1445] manager: (tap2ac9a504-9a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.146 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.149 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.150 186844 INFO os_vif [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a')
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.222 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.222 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.223 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:51:ca:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.223 186844 INFO nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Using config drive
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.946 186844 INFO nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Creating config drive at /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.config
Feb 27 17:20:19 compute-0 nova_compute[186840]: 2026-02-27 17:20:19.952 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpnm0nmqo0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.085 186844 DEBUG oslo_concurrency.processutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpnm0nmqo0" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:20:20 compute-0 kernel: tap2ac9a504-9a: entered promiscuous mode
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.1445] manager: (tap2ac9a504-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.145 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 ovn_controller[96756]: 2026-02-27T17:20:20Z|00126|binding|INFO|Claiming lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 for this chassis.
Feb 27 17:20:20 compute-0 ovn_controller[96756]: 2026-02-27T17:20:20Z|00127|binding|INFO|2ac9a504-9a49-47ff-a072-fc80d6b7f534: Claiming fa:16:3e:51:ca:75 10.100.0.11
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.159 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 systemd-udevd[220005]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.169 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:ca:75 10.100.0.11'], port_security=['fa:16:3e:51:ca:75 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45c48dce-50d6-45f9-98c5-85fa5eb52b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '39cbff6a-15d0-47da-8b62-9565d0d65b72', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec70ccd9-966e-4a90-92cd-d89fd669fdb9, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=2ac9a504-9a49-47ff-a072-fc80d6b7f534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.170 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 in datapath f0ba36c2-bb45-4368-b6ae-92ebcb190139 bound to our chassis
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.171 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0ba36c2-bb45-4368-b6ae-92ebcb190139
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.179 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[37f2a5cf-0779-4d6f-8d6a-11ebb5df9073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.180 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf0ba36c2-b1 in ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.1818] device (tap2ac9a504-9a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.182 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf0ba36c2-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.182 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7e7cde-ac4e-4e5a-ad33-46e3299edb30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.1833] device (tap2ac9a504-9a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.183 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[510bc0e3-4635-4bd7-a90b-55d9db556999]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 systemd-machined[156136]: New machine qemu-10-instance-0000000a.
Feb 27 17:20:20 compute-0 ovn_controller[96756]: 2026-02-27T17:20:20Z|00128|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 ovn-installed in OVS
Feb 27 17:20:20 compute-0 ovn_controller[96756]: 2026-02-27T17:20:20Z|00129|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 up in Southbound
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.194 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.196 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[ff7f15be-416b-480a-a23c-78fdff54f98e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000a.
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.205 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[aa20baf5-7bc8-4d9e-aa71-329da7e03adc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.223 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[59885166-33ea-43e7-a4f3-3ac3711562f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.2288] manager: (tapf0ba36c2-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.229 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[821c3ecb-84c1-4551-b74e-4d516c4b8fdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.255 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[94094a93-602a-4ef9-8911-825b2a374bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.258 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[16d75ebc-332e-42b6-89bc-87f8d3ff6a03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.2768] device (tapf0ba36c2-b0): carrier: link connected
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.282 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[5c355477-cf08-4346-9cf4-17a9930ca42c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.298 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0e30ae79-ec38-4487-979f-e6c87f6cfc14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0ba36c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:12:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369576, 'reachable_time': 22394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220039, 'error': None, 'target': 'ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.311 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[edf0b181-e6c4-4141-af9a-869165916dd7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe03:124f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 369576, 'tstamp': 369576}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220040, 'error': None, 'target': 'ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.324 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[87f1a22f-cd2f-4bdc-ab0b-2b712c8fe951]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0ba36c2-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:03:12:4f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369576, 'reachable_time': 22394, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220041, 'error': None, 'target': 'ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.351 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f2289b98-6e40-4acc-a1fe-305ba2b2d901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.404 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c2167e61-e565-492e-b33b-9706990928c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.405 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0ba36c2-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.406 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.406 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0ba36c2-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.408 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 NetworkManager[56537]: <info>  [1772212820.4095] manager: (tapf0ba36c2-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Feb 27 17:20:20 compute-0 kernel: tapf0ba36c2-b0: entered promiscuous mode
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.414 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0ba36c2-b0, col_values=(('external_ids', {'iface-id': '1c984110-a28b-4af6-aaf8-e13a1bde0740'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.415 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 ovn_controller[96756]: 2026-02-27T17:20:20Z|00130|binding|INFO|Releasing lport 1c984110-a28b-4af6-aaf8-e13a1bde0740 from this chassis (sb_readonly=0)
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.417 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0ba36c2-bb45-4368-b6ae-92ebcb190139.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0ba36c2-bb45-4368-b6ae-92ebcb190139.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.418 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[b3fb3e3f-df54-44eb-8817-3fca99eb868c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.418 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-f0ba36c2-bb45-4368-b6ae-92ebcb190139
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/f0ba36c2-bb45-4368-b6ae-92ebcb190139.pid.haproxy
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID f0ba36c2-bb45-4368-b6ae-92ebcb190139
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:20:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:20.420 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'env', 'PROCESS_TAG=haproxy-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f0ba36c2-bb45-4368-b6ae-92ebcb190139.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.420 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.452 186844 DEBUG nova.compute.manager [req-0c7db3f6-23f3-4bf2-97e7-d491c1963934 req-3eddc4c8-2bbe-44d7-a256-aff30768c97f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.453 186844 DEBUG oslo_concurrency.lockutils [req-0c7db3f6-23f3-4bf2-97e7-d491c1963934 req-3eddc4c8-2bbe-44d7-a256-aff30768c97f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.454 186844 DEBUG oslo_concurrency.lockutils [req-0c7db3f6-23f3-4bf2-97e7-d491c1963934 req-3eddc4c8-2bbe-44d7-a256-aff30768c97f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.454 186844 DEBUG oslo_concurrency.lockutils [req-0c7db3f6-23f3-4bf2-97e7-d491c1963934 req-3eddc4c8-2bbe-44d7-a256-aff30768c97f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.455 186844 DEBUG nova.compute.manager [req-0c7db3f6-23f3-4bf2-97e7-d491c1963934 req-3eddc4c8-2bbe-44d7-a256-aff30768c97f 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Processing event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.537 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212820.5369961, 45c48dce-50d6-45f9-98c5-85fa5eb52b61 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.538 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] VM Started (Lifecycle Event)
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.540 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.547 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.551 186844 INFO nova.virt.libvirt.driver [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Instance spawned successfully.
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.551 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.572 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.577 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.580 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.581 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.581 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.581 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.582 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.582 186844 DEBUG nova.virt.libvirt.driver [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.616 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.616 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212820.537878, 45c48dce-50d6-45f9-98c5-85fa5eb52b61 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.616 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] VM Paused (Lifecycle Event)
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.655 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.658 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212820.5466795, 45c48dce-50d6-45f9-98c5-85fa5eb52b61 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.659 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] VM Resumed (Lifecycle Event)
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.681 186844 INFO nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Took 7.71 seconds to spawn the instance on the hypervisor.
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.682 186844 DEBUG nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.722 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.725 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.764 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.766 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.777 186844 INFO nova.compute.manager [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Took 8.32 seconds to build instance.
Feb 27 17:20:20 compute-0 nova_compute[186840]: 2026-02-27 17:20:20.796 186844 DEBUG oslo_concurrency.lockutils [None req-8402abd0-a3bf-4d4a-a507-89f780b39d53 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:20 compute-0 podman[220080]: 2026-02-27 17:20:20.880319171 +0000 UTC m=+0.102119286 container create 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2)
Feb 27 17:20:20 compute-0 podman[220080]: 2026-02-27 17:20:20.8098643 +0000 UTC m=+0.031664345 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:20:20 compute-0 systemd[1]: Started libpod-conmon-41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141.scope.
Feb 27 17:20:20 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:20:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10e031c26218337305968741a63c60047ee362b1d500d97e93710bd717c6a9ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:20:20 compute-0 podman[220080]: 2026-02-27 17:20:20.986668555 +0000 UTC m=+0.208468600 container init 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 27 17:20:20 compute-0 podman[220080]: 2026-02-27 17:20:20.991716063 +0000 UTC m=+0.213516058 container start 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS)
Feb 27 17:20:21 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [NOTICE]   (220099) : New worker (220101) forked
Feb 27 17:20:21 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [NOTICE]   (220099) : Loading success.
Feb 27 17:20:21 compute-0 nova_compute[186840]: 2026-02-27 17:20:21.254 186844 DEBUG nova.network.neutron [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updated VIF entry in instance network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:20:21 compute-0 nova_compute[186840]: 2026-02-27 17:20:21.254 186844 DEBUG nova.network.neutron [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updating instance_info_cache with network_info: [{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:20:21 compute-0 nova_compute[186840]: 2026-02-27 17:20:21.277 186844 DEBUG oslo_concurrency.lockutils [req-82edbe93-1fef-4b5f-a846-9c8cf312ae8f req-02414872-72bc-4f79-885d-f4ec5b9df412 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.576 186844 DEBUG nova.compute.manager [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.578 186844 DEBUG oslo_concurrency.lockutils [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.579 186844 DEBUG oslo_concurrency.lockutils [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.579 186844 DEBUG oslo_concurrency.lockutils [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.580 186844 DEBUG nova.compute.manager [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:22 compute-0 nova_compute[186840]: 2026-02-27 17:20:22.580 186844 WARNING nova.compute.manager [req-4ef855ab-5b86-4fba-8d0e-8e02eadaef2f req-c694ce96-15f8-425b-817d-ac70c0b0e399 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state active and task_state None.
Feb 27 17:20:24 compute-0 nova_compute[186840]: 2026-02-27 17:20:24.194 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:24 compute-0 nova_compute[186840]: 2026-02-27 17:20:24.883 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:24 compute-0 NetworkManager[56537]: <info>  [1772212824.8847] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Feb 27 17:20:24 compute-0 NetworkManager[56537]: <info>  [1772212824.8859] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Feb 27 17:20:24 compute-0 nova_compute[186840]: 2026-02-27 17:20:24.892 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:24 compute-0 ovn_controller[96756]: 2026-02-27T17:20:24Z|00131|binding|INFO|Releasing lport 1c984110-a28b-4af6-aaf8-e13a1bde0740 from this chassis (sb_readonly=0)
Feb 27 17:20:24 compute-0 nova_compute[186840]: 2026-02-27 17:20:24.903 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:25 compute-0 podman[220111]: 2026-02-27 17:20:25.657201228 +0000 UTC m=+0.065565598 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Feb 27 17:20:25 compute-0 nova_compute[186840]: 2026-02-27 17:20:25.766 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:26 compute-0 nova_compute[186840]: 2026-02-27 17:20:26.147 186844 DEBUG nova.compute.manager [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:26 compute-0 nova_compute[186840]: 2026-02-27 17:20:26.147 186844 DEBUG nova.compute.manager [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing instance network info cache due to event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:20:26 compute-0 nova_compute[186840]: 2026-02-27 17:20:26.148 186844 DEBUG oslo_concurrency.lockutils [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:20:26 compute-0 nova_compute[186840]: 2026-02-27 17:20:26.148 186844 DEBUG oslo_concurrency.lockutils [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:20:26 compute-0 nova_compute[186840]: 2026-02-27 17:20:26.148 186844 DEBUG nova.network.neutron [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:20:28 compute-0 nova_compute[186840]: 2026-02-27 17:20:28.240 186844 DEBUG nova.network.neutron [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updated VIF entry in instance network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:20:28 compute-0 nova_compute[186840]: 2026-02-27 17:20:28.241 186844 DEBUG nova.network.neutron [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updating instance_info_cache with network_info: [{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:20:28 compute-0 nova_compute[186840]: 2026-02-27 17:20:28.293 186844 DEBUG oslo_concurrency.lockutils [req-305a38b1-899e-40bb-8d87-ef8dcfc5e117 req-fb62c361-2dbb-4a88-8d93-1ce44700c9e5 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:20:29 compute-0 nova_compute[186840]: 2026-02-27 17:20:29.196 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:30 compute-0 podman[220131]: 2026-02-27 17:20:30.646109146 +0000 UTC m=+0.049926550 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 27 17:20:30 compute-0 nova_compute[186840]: 2026-02-27 17:20:30.770 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:31 compute-0 ovn_controller[96756]: 2026-02-27T17:20:31Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:51:ca:75 10.100.0.11
Feb 27 17:20:31 compute-0 ovn_controller[96756]: 2026-02-27T17:20:31Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:ca:75 10.100.0.11
Feb 27 17:20:34 compute-0 nova_compute[186840]: 2026-02-27 17:20:34.200 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:35 compute-0 nova_compute[186840]: 2026-02-27 17:20:35.774 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:38 compute-0 nova_compute[186840]: 2026-02-27 17:20:38.670 186844 INFO nova.compute.manager [None req-de94452c-801e-4758-985c-886c9f940323 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Get console output
Feb 27 17:20:38 compute-0 nova_compute[186840]: 2026-02-27 17:20:38.674 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:20:39 compute-0 nova_compute[186840]: 2026-02-27 17:20:39.203 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:39 compute-0 nova_compute[186840]: 2026-02-27 17:20:39.469 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:39 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:39.469 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:20:39 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:39.471 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:20:40 compute-0 nova_compute[186840]: 2026-02-27 17:20:40.777 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:41 compute-0 ovn_controller[96756]: 2026-02-27T17:20:41Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:51:ca:75 10.100.0.11
Feb 27 17:20:41 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:41.474 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:42 compute-0 podman[220177]: 2026-02-27 17:20:42.656493746 +0000 UTC m=+0.062175042 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:20:42 compute-0 podman[220178]: 2026-02-27 17:20:42.657525492 +0000 UTC m=+0.064053659 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.205 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.219 186844 DEBUG nova.compute.manager [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.219 186844 DEBUG nova.compute.manager [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing instance network info cache due to event network-changed-2ac9a504-9a49-47ff-a072-fc80d6b7f534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.220 186844 DEBUG oslo_concurrency.lockutils [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.220 186844 DEBUG oslo_concurrency.lockutils [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.221 186844 DEBUG nova.network.neutron [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Refreshing network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.274 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.275 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.275 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.276 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.276 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.278 186844 INFO nova.compute.manager [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Terminating instance
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.280 186844 DEBUG nova.compute.manager [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:20:44 compute-0 kernel: tap2ac9a504-9a (unregistering): left promiscuous mode
Feb 27 17:20:44 compute-0 NetworkManager[56537]: <info>  [1772212844.3093] device (tap2ac9a504-9a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.309 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00132|binding|INFO|Releasing lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 from this chassis (sb_readonly=0)
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00133|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 down in Southbound
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00134|binding|INFO|Removing iface tap2ac9a504-9a ovn-installed in OVS
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.317 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.326 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.327 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:ca:75 10.100.0.11'], port_security=['fa:16:3e:51:ca:75 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45c48dce-50d6-45f9-98c5-85fa5eb52b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39cbff6a-15d0-47da-8b62-9565d0d65b72', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec70ccd9-966e-4a90-92cd-d89fd669fdb9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=2ac9a504-9a49-47ff-a072-fc80d6b7f534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.329 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 in datapath f0ba36c2-bb45-4368-b6ae-92ebcb190139 unbound from our chassis
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.331 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0ba36c2-bb45-4368-b6ae-92ebcb190139, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.333 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ca2370f4-4a96-41f7-812e-748e46e8d22d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.333 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139 namespace which is not needed anymore
Feb 27 17:20:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Feb 27 17:20:44 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000a.scope: Consumed 11.736s CPU time.
Feb 27 17:20:44 compute-0 systemd-machined[156136]: Machine qemu-10-instance-0000000a terminated.
Feb 27 17:20:44 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [NOTICE]   (220099) : haproxy version is 2.8.14-c23fe91
Feb 27 17:20:44 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [NOTICE]   (220099) : path to executable is /usr/sbin/haproxy
Feb 27 17:20:44 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [WARNING]  (220099) : Exiting Master process...
Feb 27 17:20:44 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [ALERT]    (220099) : Current worker (220101) exited with code 143 (Terminated)
Feb 27 17:20:44 compute-0 neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139[220095]: [WARNING]  (220099) : All workers exited. Exiting... (0)
Feb 27 17:20:44 compute-0 systemd[1]: libpod-41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141.scope: Deactivated successfully.
Feb 27 17:20:44 compute-0 podman[220245]: 2026-02-27 17:20:44.446472921 +0000 UTC m=+0.042809149 container died 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:20:44 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141-userdata-shm.mount: Deactivated successfully.
Feb 27 17:20:44 compute-0 systemd[1]: var-lib-containers-storage-overlay-10e031c26218337305968741a63c60047ee362b1d500d97e93710bd717c6a9ce-merged.mount: Deactivated successfully.
Feb 27 17:20:44 compute-0 podman[220245]: 2026-02-27 17:20:44.489192867 +0000 UTC m=+0.085529105 container cleanup 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:20:44 compute-0 systemd[1]: libpod-conmon-41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141.scope: Deactivated successfully.
Feb 27 17:20:44 compute-0 kernel: tap2ac9a504-9a: entered promiscuous mode
Feb 27 17:20:44 compute-0 NetworkManager[56537]: <info>  [1772212844.5016] manager: (tap2ac9a504-9a): new Tun device (/org/freedesktop/NetworkManager/Devices/77)
Feb 27 17:20:44 compute-0 kernel: tap2ac9a504-9a (unregistering): left promiscuous mode
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.503 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00135|binding|INFO|Claiming lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 for this chassis.
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00136|binding|INFO|2ac9a504-9a49-47ff-a072-fc80d6b7f534: Claiming fa:16:3e:51:ca:75 10.100.0.11
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00137|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 ovn-installed in OVS
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00138|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 up in Southbound
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00139|binding|INFO|Releasing lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 from this chassis (sb_readonly=1)
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00140|if_status|INFO|Not setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 down as sb is readonly
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.515 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:ca:75 10.100.0.11'], port_security=['fa:16:3e:51:ca:75 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45c48dce-50d6-45f9-98c5-85fa5eb52b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39cbff6a-15d0-47da-8b62-9565d0d65b72', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec70ccd9-966e-4a90-92cd-d89fd669fdb9, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=2ac9a504-9a49-47ff-a072-fc80d6b7f534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00141|binding|INFO|Removing iface tap2ac9a504-9a ovn-installed in OVS
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.516 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.521 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00142|binding|INFO|Releasing lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 from this chassis (sb_readonly=0)
Feb 27 17:20:44 compute-0 ovn_controller[96756]: 2026-02-27T17:20:44Z|00143|binding|INFO|Setting lport 2ac9a504-9a49-47ff-a072-fc80d6b7f534 down in Southbound
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.535 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:ca:75 10.100.0.11'], port_security=['fa:16:3e:51:ca:75 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '45c48dce-50d6-45f9-98c5-85fa5eb52b61', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39cbff6a-15d0-47da-8b62-9565d0d65b72', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec70ccd9-966e-4a90-92cd-d89fd669fdb9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=2ac9a504-9a49-47ff-a072-fc80d6b7f534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.547 186844 INFO nova.virt.libvirt.driver [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Instance destroyed successfully.
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.547 186844 DEBUG nova.objects.instance [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 45c48dce-50d6-45f9-98c5-85fa5eb52b61 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.565 186844 DEBUG nova.virt.libvirt.vif [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:20:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1948732682',display_name='tempest-TestNetworkBasicOps-server-1948732682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1948732682',id=10,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAZpMH2svT+uPhWoBoRCUdLAhaSIdrh9zxtS+bFDvUoYUDLQ5r0uxCjzhdN3tdqzUb9iWTNBRIUPXIzn3UG2fcwjEESz0IsmWqXCuZE0BFT2xT4N0+0NLKCZYOFGG/IThg==',key_name='tempest-TestNetworkBasicOps-1690156444',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:20:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-wqtbnqx0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:20:20Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=45c48dce-50d6-45f9-98c5-85fa5eb52b61,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.566 186844 DEBUG nova.network.os_vif_util [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.567 186844 DEBUG nova.network.os_vif_util [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.568 186844 DEBUG os_vif [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:20:44 compute-0 podman[220274]: 2026-02-27 17:20:44.570117815 +0000 UTC m=+0.050900526 container remove 41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.570 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.571 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2ac9a504-9a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.572 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.574 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[74c495d7-5b1c-4c51-b000-d2d7ba1dd4b9]: (4, ('Fri Feb 27 05:20:44 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139 (41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141)\n41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141\nFri Feb 27 05:20:44 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139 (41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141)\n41fa13368a0b82a8abaeeb5669bc18751da5827e0c46eccc1c2a7f4d0fe1b141\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.575 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.576 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7fbb5b65-e5d2-439c-8b51-83e50123b973]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.577 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0ba36c2-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:20:44 compute-0 kernel: tapf0ba36c2-b0: left promiscuous mode
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.578 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.581 186844 INFO os_vif [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:51:ca:75,bridge_name='br-int',has_traffic_filtering=True,id=2ac9a504-9a49-47ff-a072-fc80d6b7f534,network=Network(f0ba36c2-bb45-4368-b6ae-92ebcb190139),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2ac9a504-9a')
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.581 186844 INFO nova.virt.libvirt.driver [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Deleting instance files /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61_del
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.581 186844 INFO nova.virt.libvirt.driver [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Deletion of /var/lib/nova/instances/45c48dce-50d6-45f9-98c5-85fa5eb52b61_del complete
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.584 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.586 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[acff7de0-c45c-4c43-9924-296646db5b05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.606 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a3dcc30a-29c8-4b3c-bba1-a5ed2b1a79ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.607 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[822f1a29-ae9d-4966-b370-8f41832a6c9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.621 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[087fa5ac-4a2f-48be-8b48-62a5514cdf61]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 369571, 'reachable_time': 20072, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220297, 'error': None, 'target': 'ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.624 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f0ba36c2-bb45-4368-b6ae-92ebcb190139 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.624 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[b0363b38-8447-4a29-bd42-5c0608e4e40a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 systemd[1]: run-netns-ovnmeta\x2df0ba36c2\x2dbb45\x2d4368\x2db6ae\x2d92ebcb190139.mount: Deactivated successfully.
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.625 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 in datapath f0ba36c2-bb45-4368-b6ae-92ebcb190139 unbound from our chassis
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.626 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0ba36c2-bb45-4368-b6ae-92ebcb190139, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.627 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[88898019-6dc9-4753-95f2-beec8b94865c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.627 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 2ac9a504-9a49-47ff-a072-fc80d6b7f534 in datapath f0ba36c2-bb45-4368-b6ae-92ebcb190139 unbound from our chassis
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.628 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0ba36c2-bb45-4368-b6ae-92ebcb190139, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:20:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:44.629 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7154eed8-01fa-4bde-aa74-7f05911abbca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.645 186844 INFO nova.compute.manager [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Took 0.36 seconds to destroy the instance on the hypervisor.
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.646 186844 DEBUG oslo.service.loopingcall [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.646 186844 DEBUG nova.compute.manager [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:20:44 compute-0 nova_compute[186840]: 2026-02-27 17:20:44.646 186844 DEBUG nova.network.neutron [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.560 186844 DEBUG nova.network.neutron [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.587 186844 INFO nova.compute.manager [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Took 0.94 seconds to deallocate network for instance.
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.647 186844 DEBUG nova.compute.manager [req-751fba09-592e-409b-bcf4-ac01f589ba97 req-722beeaa-f96e-40c6-a9b6-0ccb2619f273 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-deleted-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.655 186844 DEBUG nova.network.neutron [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updated VIF entry in instance network info cache for port 2ac9a504-9a49-47ff-a072-fc80d6b7f534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.656 186844 DEBUG nova.network.neutron [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Updating instance_info_cache with network_info: [{"id": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "address": "fa:16:3e:51:ca:75", "network": {"id": "f0ba36c2-bb45-4368-b6ae-92ebcb190139", "bridge": "br-int", "label": "tempest-network-smoke--1718668512", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2ac9a504-9a", "ovs_interfaceid": "2ac9a504-9a49-47ff-a072-fc80d6b7f534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.659 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.659 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.681 186844 DEBUG oslo_concurrency.lockutils [req-558ae56c-6ff6-46a6-aeb3-c940864572b3 req-8d22383f-e389-41fc-b728-e41aa5be8ba8 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-45c48dce-50d6-45f9-98c5-85fa5eb52b61" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.722 186844 DEBUG nova.compute.provider_tree [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.741 186844 DEBUG nova.scheduler.client.report [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.765 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.780 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.809 186844 INFO nova.scheduler.client.report [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 45c48dce-50d6-45f9-98c5-85fa5eb52b61
Feb 27 17:20:45 compute-0 nova_compute[186840]: 2026-02-27 17:20:45.878 186844 DEBUG oslo_concurrency.lockutils [None req-22dd7fc1-bb26-4608-b811-1f13aa8a7390 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.323 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.323 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.324 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.325 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.326 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.327 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-unplugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.328 186844 DEBUG oslo_concurrency.lockutils [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "45c48dce-50d6-45f9-98c5-85fa5eb52b61-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.329 186844 DEBUG nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] No waiting events found dispatching network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:20:46 compute-0 nova_compute[186840]: 2026-02-27 17:20:46.329 186844 WARNING nova.compute.manager [req-a4e582bf-5df9-4800-b55b-dfdf98dac3fe req-919f0192-329a-476f-a939-da93f0c928d0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Received unexpected event network-vif-plugged-2ac9a504-9a49-47ff-a072-fc80d6b7f534 for instance with vm_state deleted and task_state None.
Feb 27 17:20:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:47.094 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:20:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:47.095 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:20:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:20:47.095 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:20:47 compute-0 podman[220298]: 2026-02-27 17:20:47.695189199 +0000 UTC m=+0.087513415 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Feb 27 17:20:47 compute-0 podman[220299]: 2026-02-27 17:20:47.716896351 +0000 UTC m=+0.109700210 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Feb 27 17:20:49 compute-0 nova_compute[186840]: 2026-02-27 17:20:49.574 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:50 compute-0 nova_compute[186840]: 2026-02-27 17:20:50.826 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:51 compute-0 nova_compute[186840]: 2026-02-27 17:20:51.469 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:51 compute-0 nova_compute[186840]: 2026-02-27 17:20:51.500 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:54 compute-0 nova_compute[186840]: 2026-02-27 17:20:54.630 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:55 compute-0 nova_compute[186840]: 2026-02-27 17:20:55.865 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:20:56 compute-0 podman[220348]: 2026-02-27 17:20:56.646758747 +0000 UTC m=+0.048541375 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:20:59 compute-0 nova_compute[186840]: 2026-02-27 17:20:59.545 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212844.5426989, 45c48dce-50d6-45f9-98c5-85fa5eb52b61 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:20:59 compute-0 nova_compute[186840]: 2026-02-27 17:20:59.545 186844 INFO nova.compute.manager [-] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] VM Stopped (Lifecycle Event)
Feb 27 17:20:59 compute-0 nova_compute[186840]: 2026-02-27 17:20:59.575 186844 DEBUG nova.compute.manager [None req-bf0b6165-5dd8-4aa9-9ef2-05d5dad91e74 - - - - - -] [instance: 45c48dce-50d6-45f9-98c5-85fa5eb52b61] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:20:59 compute-0 nova_compute[186840]: 2026-02-27 17:20:59.632 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:00 compute-0 nova_compute[186840]: 2026-02-27 17:21:00.898 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:01 compute-0 podman[220370]: 2026-02-27 17:21:01.66108659 +0000 UTC m=+0.067225520 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:21:04 compute-0 nova_compute[186840]: 2026-02-27 17:21:04.668 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:05 compute-0 nova_compute[186840]: 2026-02-27 17:21:05.933 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:09 compute-0 nova_compute[186840]: 2026-02-27 17:21:09.714 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:10 compute-0 nova_compute[186840]: 2026-02-27 17:21:10.955 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:11 compute-0 nova_compute[186840]: 2026-02-27 17:21:11.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:13 compute-0 podman[220395]: 2026-02-27 17:21:13.664998677 +0000 UTC m=+0.074870493 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:21:13 compute-0 podman[220396]: 2026-02-27 17:21:13.665415727 +0000 UTC m=+0.066530287 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.737 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.737 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.738 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.738 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.944 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.947 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5785MB free_disk=73.19436645507812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.947 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:13 compute-0 nova_compute[186840]: 2026-02-27 17:21:13.948 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.038 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.039 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.082 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.131 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.161 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.161 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:14 compute-0 nova_compute[186840]: 2026-02-27 17:21:14.718 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.162 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.306 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.306 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.326 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.423 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.424 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.431 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.432 186844 INFO nova.compute.claims [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.587 186844 DEBUG nova.compute.provider_tree [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.628 186844 DEBUG nova.scheduler.client.report [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.661 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.662 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.776 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.777 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.807 186844 INFO nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.848 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.958 186844 DEBUG nova.policy [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:21:15 compute-0 nova_compute[186840]: 2026-02-27 17:21:15.998 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.024 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.026 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.027 186844 INFO nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Creating image(s)
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.027 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.028 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.029 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.054 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.115 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.116 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.117 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.126 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.179 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.180 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.208 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk 1073741824" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.209 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.210 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.281 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.282 186844 DEBUG nova.virt.disk.api [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.283 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.336 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.337 186844 DEBUG nova.virt.disk.api [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.338 186844 DEBUG nova.objects.instance [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 76796d65-1ae6-49b8-a27c-99e9f3e98d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.360 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.361 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Ensure instance console log exists: /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.361 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.362 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.363 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.676 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Successfully created port: 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.727 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Feb 27 17:21:16 compute-0 nova_compute[186840]: 2026-02-27 17:21:16.727 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:21:17 compute-0 nova_compute[186840]: 2026-02-27 17:21:17.722 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.270 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Successfully updated port: 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.323 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.323 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.324 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.398 186844 DEBUG nova.compute.manager [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.399 186844 DEBUG nova.compute.manager [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing instance network info cache due to event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.400 186844 DEBUG oslo_concurrency.lockutils [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.533 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:21:18 compute-0 podman[220453]: 2026-02-27 17:21:18.666904386 +0000 UTC m=+0.062952059 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:18 compute-0 podman[220454]: 2026-02-27 17:21:18.696413946 +0000 UTC m=+0.091957236 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:21:18 compute-0 nova_compute[186840]: 2026-02-27 17:21:18.713 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.218 186844 DEBUG nova.network.neutron [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.345 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.346 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance network_info: |[{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.347 186844 DEBUG oslo_concurrency.lockutils [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.347 186844 DEBUG nova.network.neutron [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.351 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Start _get_guest_xml network_info=[{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.356 186844 WARNING nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.382 186844 DEBUG nova.virt.libvirt.host [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.384 186844 DEBUG nova.virt.libvirt.host [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.424 186844 DEBUG nova.virt.libvirt.host [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.425 186844 DEBUG nova.virt.libvirt.host [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.426 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.427 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.427 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.428 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.428 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.428 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.428 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.429 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.429 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.429 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.429 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.430 186844 DEBUG nova.virt.hardware [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.433 186844 DEBUG nova.virt.libvirt.vif [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1377424799',display_name='tempest-TestNetworkBasicOps-server-1377424799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1377424799',id=11,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5yMj7rv2FoFn43N91mInOFoiaVjdK+YUgcyY4/JaxhlF+sTH28XfB7CGBYhg4B37Cdd0aIrCrJzciXqQQtOxDwn9R4t3S5eghkg4mO9lFmSAyX4zi+cmINV24rq34ZTA==',key_name='tempest-TestNetworkBasicOps-287151659',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-2d6jv0aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:21:15Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=76796d65-1ae6-49b8-a27c-99e9f3e98d8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.433 186844 DEBUG nova.network.os_vif_util [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.434 186844 DEBUG nova.network.os_vif_util [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.434 186844 DEBUG nova.objects.instance [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 76796d65-1ae6-49b8-a27c-99e9f3e98d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.541 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <uuid>76796d65-1ae6-49b8-a27c-99e9f3e98d8f</uuid>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <name>instance-0000000b</name>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1377424799</nova:name>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:21:19</nova:creationTime>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         <nova:port uuid="3e762eca-ffd0-4082-a5a7-bd3e3880d6fb">
Feb 27 17:21:19 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <system>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="serial">76796d65-1ae6-49b8-a27c-99e9f3e98d8f</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="uuid">76796d65-1ae6-49b8-a27c-99e9f3e98d8f</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </system>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <os>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </os>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <features>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </features>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.config"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:65:bf:ef"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <target dev="tap3e762eca-ff"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/console.log" append="off"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <video>
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </video>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:21:19 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:21:19 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:21:19 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:21:19 compute-0 nova_compute[186840]: </domain>
Feb 27 17:21:19 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.542 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Preparing to wait for external event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.543 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.544 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.544 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.545 186844 DEBUG nova.virt.libvirt.vif [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1377424799',display_name='tempest-TestNetworkBasicOps-server-1377424799',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1377424799',id=11,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5yMj7rv2FoFn43N91mInOFoiaVjdK+YUgcyY4/JaxhlF+sTH28XfB7CGBYhg4B37Cdd0aIrCrJzciXqQQtOxDwn9R4t3S5eghkg4mO9lFmSAyX4zi+cmINV24rq34ZTA==',key_name='tempest-TestNetworkBasicOps-287151659',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-2d6jv0aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:21:15Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=76796d65-1ae6-49b8-a27c-99e9f3e98d8f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.545 186844 DEBUG nova.network.os_vif_util [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.546 186844 DEBUG nova.network.os_vif_util [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.547 186844 DEBUG os_vif [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.548 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.548 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.549 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.552 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.553 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3e762eca-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.553 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3e762eca-ff, col_values=(('external_ids', {'iface-id': '3e762eca-ffd0-4082-a5a7-bd3e3880d6fb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:bf:ef', 'vm-uuid': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:19 compute-0 NetworkManager[56537]: <info>  [1772212879.5559] manager: (tap3e762eca-ff): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.556 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.558 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.562 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.563 186844 INFO os_vif [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff')
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.615 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.616 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.616 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:65:bf:ef, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.617 186844 INFO nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Using config drive
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.944 186844 INFO nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Creating config drive at /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.config
Feb 27 17:21:19 compute-0 nova_compute[186840]: 2026-02-27 17:21:19.949 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpvbrlbt1y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.075 186844 DEBUG oslo_concurrency.processutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmpvbrlbt1y" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:20 compute-0 kernel: tap3e762eca-ff: entered promiscuous mode
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.1435] manager: (tap3e762eca-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.145 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 ovn_controller[96756]: 2026-02-27T17:21:20Z|00144|binding|INFO|Claiming lport 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for this chassis.
Feb 27 17:21:20 compute-0 ovn_controller[96756]: 2026-02-27T17:21:20Z|00145|binding|INFO|3e762eca-ffd0-4082-a5a7-bd3e3880d6fb: Claiming fa:16:3e:65:bf:ef 10.100.0.3
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.153 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.155 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.163 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:bf:ef 10.100.0.3'], port_security=['fa:16:3e:65:bf:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fd5bbb1b-a90c-4d23-afdc-bfe41a6e60ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5da8242-881c-4f4b-b09e-baa93811cec7, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.165 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb in datapath f9adecf3-bded-4f54-b5e0-7e0c3564bf2a bound to our chassis
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.168 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9adecf3-bded-4f54-b5e0-7e0c3564bf2a
Feb 27 17:21:20 compute-0 ovn_controller[96756]: 2026-02-27T17:21:20Z|00146|binding|INFO|Setting lport 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb ovn-installed in OVS
Feb 27 17:21:20 compute-0 ovn_controller[96756]: 2026-02-27T17:21:20Z|00147|binding|INFO|Setting lport 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb up in Southbound
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.172 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 systemd-machined[156136]: New machine qemu-11-instance-0000000b.
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.178 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e563b6bb-d9a9-4d84-a137-20fb4ead6843]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.180 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf9adecf3-b1 in ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.182 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf9adecf3-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.182 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1df6dea0-0724-4212-b98b-ec35d5fc7970]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.183 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3a55ced5-45ba-4a6f-b5e7-18ad6f045a91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.194 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[319d75b2-35dc-47b4-ba62-aaab8433b242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000b.
Feb 27 17:21:20 compute-0 systemd-udevd[220523]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.205 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[e5b417ab-29ab-4231-abfb-fff09f31878f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.2165] device (tap3e762eca-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.2175] device (tap3e762eca-ff): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.235 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a1c190-e376-4be9-b31d-049b2cb03ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.2433] manager: (tapf9adecf3-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/80)
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.242 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7942f3f9-8c06-4477-a0cb-4000646d1332]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.269 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[00ac03d0-dc66-46aa-b7e8-f79a237f7a56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.272 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[7061e69a-0879-416a-aec7-1f950aac0ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.2900] device (tapf9adecf3-b0): carrier: link connected
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.293 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[434eb7ab-e782-41be-b290-b17a5c18202c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.307 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[72978825-3a4e-4aff-80b1-808d5e2091b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9adecf3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:6e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375578, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220553, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.318 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c42e537b-9a50-436f-9adf-e1972746376e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef8:6ef0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375578, 'tstamp': 375578}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220554, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.329 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[40f63370-7a27-448e-97c3-58f993dbc823]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9adecf3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:6e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375578, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220555, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.355 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[44d63b34-aaa2-456d-89ed-a9cbafb4ea19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.372 186844 DEBUG nova.compute.manager [req-a98595e8-0e0f-4ff4-bbfb-7d06aa9b8568 req-923fabe5-96ca-4289-888c-5d0776f38743 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.373 186844 DEBUG oslo_concurrency.lockutils [req-a98595e8-0e0f-4ff4-bbfb-7d06aa9b8568 req-923fabe5-96ca-4289-888c-5d0776f38743 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.373 186844 DEBUG oslo_concurrency.lockutils [req-a98595e8-0e0f-4ff4-bbfb-7d06aa9b8568 req-923fabe5-96ca-4289-888c-5d0776f38743 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.374 186844 DEBUG oslo_concurrency.lockutils [req-a98595e8-0e0f-4ff4-bbfb-7d06aa9b8568 req-923fabe5-96ca-4289-888c-5d0776f38743 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.374 186844 DEBUG nova.compute.manager [req-a98595e8-0e0f-4ff4-bbfb-7d06aa9b8568 req-923fabe5-96ca-4289-888c-5d0776f38743 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Processing event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.404 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[88f91648-4545-4e6f-b4d3-0a6f5fdb605c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.405 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9adecf3-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.406 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.407 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9adecf3-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.408 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 kernel: tapf9adecf3-b0: entered promiscuous mode
Feb 27 17:21:20 compute-0 NetworkManager[56537]: <info>  [1772212880.4097] manager: (tapf9adecf3-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.411 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.413 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9adecf3-b0, col_values=(('external_ids', {'iface-id': '528129d5-a74d-4559-b145-2d5af576200c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.414 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 ovn_controller[96756]: 2026-02-27T17:21:20Z|00148|binding|INFO|Releasing lport 528129d5-a74d-4559-b145-2d5af576200c from this chassis (sb_readonly=0)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.419 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.420 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f9adecf3-bded-4f54-b5e0-7e0c3564bf2a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f9adecf3-bded-4f54-b5e0-7e0c3564bf2a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.421 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfcc77a-9925-4780-973f-ca135770208f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.421 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/f9adecf3-bded-4f54-b5e0-7e0c3564bf2a.pid.haproxy
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID f9adecf3-bded-4f54-b5e0-7e0c3564bf2a
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:21:20 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:20.422 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'env', 'PROCESS_TAG=haproxy-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f9adecf3-bded-4f54-b5e0-7e0c3564bf2a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.444 186844 DEBUG nova.network.neutron [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updated VIF entry in instance network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.444 186844 DEBUG nova.network.neutron [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.464 186844 DEBUG oslo_concurrency.lockutils [req-536c77e6-69c1-404f-82f4-b8e8be3c53d0 req-1d8775f5-6a54-4d3b-bb52-6a92d6f012f0 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.518 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212880.517779, 76796d65-1ae6-49b8-a27c-99e9f3e98d8f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.518 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] VM Started (Lifecycle Event)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.521 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.526 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.531 186844 INFO nova.virt.libvirt.driver [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance spawned successfully.
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.532 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.537 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.541 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.550 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.551 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.551 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.552 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.552 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.553 186844 DEBUG nova.virt.libvirt.driver [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.559 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.560 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212880.5187848, 76796d65-1ae6-49b8-a27c-99e9f3e98d8f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.560 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] VM Paused (Lifecycle Event)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.591 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.602 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212880.526313, 76796d65-1ae6-49b8-a27c-99e9f3e98d8f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.602 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] VM Resumed (Lifecycle Event)
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.609 186844 INFO nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Took 4.58 seconds to spawn the instance on the hypervisor.
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.610 186844 DEBUG nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.618 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.621 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.638 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.664 186844 INFO nova.compute.manager [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Took 5.27 seconds to build instance.
Feb 27 17:21:20 compute-0 nova_compute[186840]: 2026-02-27 17:21:20.697 186844 DEBUG oslo_concurrency.lockutils [None req-f2af326f-3c3e-4c56-bfa6-4751b7224b3e 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:20 compute-0 podman[220594]: 2026-02-27 17:21:20.741366639 +0000 UTC m=+0.045894306 container create 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:21:20 compute-0 systemd[1]: Started libpod-conmon-457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4.scope.
Feb 27 17:21:20 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:21:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15c4b5caa68f6c3a2727ea18a10b879d21b7a96036422b43dcdb4b724348c2a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:21:20 compute-0 podman[220594]: 2026-02-27 17:21:20.716191697 +0000 UTC m=+0.020719374 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:21:20 compute-0 podman[220594]: 2026-02-27 17:21:20.822450835 +0000 UTC m=+0.126978552 container init 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0)
Feb 27 17:21:20 compute-0 podman[220594]: 2026-02-27 17:21:20.826077085 +0000 UTC m=+0.130604762 container start 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, tcib_managed=true)
Feb 27 17:21:20 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [NOTICE]   (220613) : New worker (220615) forked
Feb 27 17:21:20 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [NOTICE]   (220613) : Loading success.
Feb 27 17:21:21 compute-0 nova_compute[186840]: 2026-02-27 17:21:21.002 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:21 compute-0 nova_compute[186840]: 2026-02-27 17:21:21.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:21:21 compute-0 nova_compute[186840]: 2026-02-27 17:21:21.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.461 186844 DEBUG nova.compute.manager [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.462 186844 DEBUG oslo_concurrency.lockutils [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.463 186844 DEBUG oslo_concurrency.lockutils [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.463 186844 DEBUG oslo_concurrency.lockutils [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.464 186844 DEBUG nova.compute.manager [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:21:22 compute-0 nova_compute[186840]: 2026-02-27 17:21:22.465 186844 WARNING nova.compute.manager [req-2653572c-f67c-42fb-aeb3-48b6b941ec22 req-94e18f1e-42af-4f97-acd6-dd2cabeb7304 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state active and task_state None.
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.556 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:24 compute-0 ovn_controller[96756]: 2026-02-27T17:21:24Z|00149|binding|INFO|Releasing lport 528129d5-a74d-4559-b145-2d5af576200c from this chassis (sb_readonly=0)
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.660 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:24 compute-0 NetworkManager[56537]: <info>  [1772212884.6611] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Feb 27 17:21:24 compute-0 NetworkManager[56537]: <info>  [1772212884.6626] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.667 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:24 compute-0 ovn_controller[96756]: 2026-02-27T17:21:24Z|00150|binding|INFO|Releasing lport 528129d5-a74d-4559-b145-2d5af576200c from this chassis (sb_readonly=0)
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.953 186844 DEBUG nova.compute.manager [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.954 186844 DEBUG nova.compute.manager [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing instance network info cache due to event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.955 186844 DEBUG oslo_concurrency.lockutils [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.955 186844 DEBUG oslo_concurrency.lockutils [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:24 compute-0 nova_compute[186840]: 2026-02-27 17:21:24.956 186844 DEBUG nova.network.neutron [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:21:26 compute-0 nova_compute[186840]: 2026-02-27 17:21:26.004 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:26 compute-0 nova_compute[186840]: 2026-02-27 17:21:26.017 186844 DEBUG nova.network.neutron [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updated VIF entry in instance network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:21:26 compute-0 nova_compute[186840]: 2026-02-27 17:21:26.018 186844 DEBUG nova.network.neutron [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:26 compute-0 nova_compute[186840]: 2026-02-27 17:21:26.038 186844 DEBUG oslo_concurrency.lockutils [req-7cfb57bb-b8ef-442e-abdb-5d347360d857 req-22a6d588-96a9-47e6-92e0-2768a03c03d4 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:27 compute-0 podman[220625]: 2026-02-27 17:21:27.682698504 +0000 UTC m=+0.075869348 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 27 17:21:29 compute-0 nova_compute[186840]: 2026-02-27 17:21:29.560 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:31 compute-0 nova_compute[186840]: 2026-02-27 17:21:31.008 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:32 compute-0 ovn_controller[96756]: 2026-02-27T17:21:32Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:bf:ef 10.100.0.3
Feb 27 17:21:32 compute-0 ovn_controller[96756]: 2026-02-27T17:21:32Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:bf:ef 10.100.0.3
Feb 27 17:21:32 compute-0 podman[220657]: 2026-02-27 17:21:32.659551963 +0000 UTC m=+0.063525492 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.687 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.688 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.706 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.797 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.798 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.808 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.808 186844 INFO nova.compute.claims [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.945 186844 DEBUG nova.compute.provider_tree [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:21:32 compute-0 nova_compute[186840]: 2026-02-27 17:21:32.970 186844 DEBUG nova.scheduler.client.report [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.000 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.001 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.052 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.052 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.077 186844 INFO nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.099 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.202 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.204 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.204 186844 INFO nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Creating image(s)
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.205 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.206 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.207 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.233 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.306 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.307 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.307 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.318 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.361 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.363 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.402 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.403 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.403 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.482 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.484 186844 DEBUG nova.virt.disk.api [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.484 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.542 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.544 186844 DEBUG nova.virt.disk.api [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.544 186844 DEBUG nova.objects.instance [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 8f00a140-3d50-4972-b03d-28a0e12800c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.789 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.789 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Ensure instance console log exists: /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.790 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.791 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.791 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:33 compute-0 nova_compute[186840]: 2026-02-27 17:21:33.980 186844 DEBUG nova.policy [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:21:34 compute-0 nova_compute[186840]: 2026-02-27 17:21:34.565 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:34 compute-0 nova_compute[186840]: 2026-02-27 17:21:34.994 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Successfully created port: ebac04fa-73a7-4be5-b486-8b87c3270448 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.775 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Successfully updated port: ebac04fa-73a7-4be5-b486-8b87c3270448 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.802 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.802 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.803 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.896 186844 DEBUG nova.compute.manager [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.896 186844 DEBUG nova.compute.manager [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing instance network info cache due to event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.897 186844 DEBUG oslo_concurrency.lockutils [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:35 compute-0 nova_compute[186840]: 2026-02-27 17:21:35.966 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:21:36 compute-0 nova_compute[186840]: 2026-02-27 17:21:36.010 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.200 186844 DEBUG nova.network.neutron [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updating instance_info_cache with network_info: [{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.235 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.235 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Instance network_info: |[{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.236 186844 DEBUG oslo_concurrency.lockutils [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.237 186844 DEBUG nova.network.neutron [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.242 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Start _get_guest_xml network_info=[{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.248 186844 WARNING nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.256 186844 DEBUG nova.virt.libvirt.host [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.256 186844 DEBUG nova.virt.libvirt.host [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.264 186844 DEBUG nova.virt.libvirt.host [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.265 186844 DEBUG nova.virt.libvirt.host [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.266 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.266 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.267 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.267 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.268 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.269 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.269 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.270 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.270 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.271 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.271 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.272 186844 DEBUG nova.virt.hardware [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.278 186844 DEBUG nova.virt.libvirt.vif [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1090587586',display_name='tempest-TestNetworkBasicOps-server-1090587586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1090587586',id=12,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBENlNHBW2+Q5G7+YcIic0pFABdrSs8/tx6oq/c4459Blm4jV0iJ8gdIDrvawoG2LeCANWjKTh1pRvxpwMa8STXngLk0QWms6S10pber5A7UFowOroastyEwgDpMm42OA0Q==',key_name='tempest-TestNetworkBasicOps-1020147262',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ms4byc60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:21:33Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8f00a140-3d50-4972-b03d-28a0e12800c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.278 186844 DEBUG nova.network.os_vif_util [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.280 186844 DEBUG nova.network.os_vif_util [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.281 186844 DEBUG nova.objects.instance [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 8f00a140-3d50-4972-b03d-28a0e12800c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.304 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <uuid>8f00a140-3d50-4972-b03d-28a0e12800c8</uuid>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <name>instance-0000000c</name>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1090587586</nova:name>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:21:37</nova:creationTime>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         <nova:port uuid="ebac04fa-73a7-4be5-b486-8b87c3270448">
Feb 27 17:21:37 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <system>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="serial">8f00a140-3d50-4972-b03d-28a0e12800c8</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="uuid">8f00a140-3d50-4972-b03d-28a0e12800c8</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </system>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <os>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </os>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <features>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </features>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.config"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:32:df:80"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <target dev="tapebac04fa-73"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/console.log" append="off"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <video>
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </video>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:21:37 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:21:37 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:21:37 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:21:37 compute-0 nova_compute[186840]: </domain>
Feb 27 17:21:37 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.306 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Preparing to wait for external event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.306 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.307 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.308 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.309 186844 DEBUG nova.virt.libvirt.vif [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1090587586',display_name='tempest-TestNetworkBasicOps-server-1090587586',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1090587586',id=12,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBENlNHBW2+Q5G7+YcIic0pFABdrSs8/tx6oq/c4459Blm4jV0iJ8gdIDrvawoG2LeCANWjKTh1pRvxpwMa8STXngLk0QWms6S10pber5A7UFowOroastyEwgDpMm42OA0Q==',key_name='tempest-TestNetworkBasicOps-1020147262',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ms4byc60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:21:33Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8f00a140-3d50-4972-b03d-28a0e12800c8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.309 186844 DEBUG nova.network.os_vif_util [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.310 186844 DEBUG nova.network.os_vif_util [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.311 186844 DEBUG os_vif [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.312 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.312 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.313 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.317 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.317 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapebac04fa-73, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.318 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapebac04fa-73, col_values=(('external_ids', {'iface-id': 'ebac04fa-73a7-4be5-b486-8b87c3270448', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:32:df:80', 'vm-uuid': '8f00a140-3d50-4972-b03d-28a0e12800c8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.320 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:37 compute-0 NetworkManager[56537]: <info>  [1772212897.3217] manager: (tapebac04fa-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.326 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.327 186844 INFO os_vif [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73')
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.401 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.402 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.402 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:32:df:80, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:21:37 compute-0 nova_compute[186840]: 2026-02-27 17:21:37.403 186844 INFO nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Using config drive
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.169 186844 INFO nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Creating config drive at /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.config
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.177 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp4ilmv4hh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.302 186844 DEBUG oslo_concurrency.processutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp4ilmv4hh" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:21:38 compute-0 kernel: tapebac04fa-73: entered promiscuous mode
Feb 27 17:21:38 compute-0 NetworkManager[56537]: <info>  [1772212898.3584] manager: (tapebac04fa-73): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Feb 27 17:21:38 compute-0 ovn_controller[96756]: 2026-02-27T17:21:38Z|00151|binding|INFO|Claiming lport ebac04fa-73a7-4be5-b486-8b87c3270448 for this chassis.
Feb 27 17:21:38 compute-0 ovn_controller[96756]: 2026-02-27T17:21:38Z|00152|binding|INFO|ebac04fa-73a7-4be5-b486-8b87c3270448: Claiming fa:16:3e:32:df:80 10.100.0.13
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.366 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:38 compute-0 ovn_controller[96756]: 2026-02-27T17:21:38Z|00153|binding|INFO|Setting lport ebac04fa-73a7-4be5-b486-8b87c3270448 ovn-installed in OVS
Feb 27 17:21:38 compute-0 ovn_controller[96756]: 2026-02-27T17:21:38Z|00154|binding|INFO|Setting lport ebac04fa-73a7-4be5-b486-8b87c3270448 up in Southbound
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.378 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:df:80 10.100.0.13'], port_security=['fa:16:3e:32:df:80 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3452a48-3d12-4bdb-ba02-d5fde1af6478', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5da8242-881c-4f4b-b09e-baa93811cec7, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=ebac04fa-73a7-4be5-b486-8b87c3270448) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.379 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.382 106085 INFO neutron.agent.ovn.metadata.agent [-] Port ebac04fa-73a7-4be5-b486-8b87c3270448 in datapath f9adecf3-bded-4f54-b5e0-7e0c3564bf2a bound to our chassis
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.385 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9adecf3-bded-4f54-b5e0-7e0c3564bf2a
Feb 27 17:21:38 compute-0 systemd-machined[156136]: New machine qemu-12-instance-0000000c.
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.400 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ea61bf20-e2ef-47a9-a49b-0292994d3b14]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-0000000c.
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.433 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[a2df4a48-282c-43ff-9a7c-df7c7cb969d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 systemd-udevd[220721]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.438 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[14820d98-5a1f-4482-833f-4a9f2e5fb35c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 NetworkManager[56537]: <info>  [1772212898.4577] device (tapebac04fa-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:21:38 compute-0 NetworkManager[56537]: <info>  [1772212898.4588] device (tapebac04fa-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.471 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb71bac-f9a2-4bb4-943b-189f7abc8d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.488 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba1d340-936b-4439-b462-110cf81fb510]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9adecf3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:6e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 7, 'tx_packets': 6, 'rx_bytes': 574, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 7, 'tx_packets': 6, 'rx_bytes': 574, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375578, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220731, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.505 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[56f01baa-4953-4c59-936e-79438dff8bc3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9adecf3-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375586, 'tstamp': 375586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220733, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9adecf3-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375589, 'tstamp': 375589}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220733, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.507 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9adecf3-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.511 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9adecf3-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.512 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.513 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9adecf3-b0, col_values=(('external_ids', {'iface-id': '528129d5-a74d-4559-b145-2d5af576200c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:38 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:38.513 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.514 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.616 186844 DEBUG nova.compute.manager [req-a3dd45cd-b00d-48c8-9f95-6890f781c551 req-4c07236a-81bf-4c2b-a9e6-76a15ed1d11d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.617 186844 DEBUG oslo_concurrency.lockutils [req-a3dd45cd-b00d-48c8-9f95-6890f781c551 req-4c07236a-81bf-4c2b-a9e6-76a15ed1d11d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.618 186844 DEBUG oslo_concurrency.lockutils [req-a3dd45cd-b00d-48c8-9f95-6890f781c551 req-4c07236a-81bf-4c2b-a9e6-76a15ed1d11d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.618 186844 DEBUG oslo_concurrency.lockutils [req-a3dd45cd-b00d-48c8-9f95-6890f781c551 req-4c07236a-81bf-4c2b-a9e6-76a15ed1d11d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:38 compute-0 nova_compute[186840]: 2026-02-27 17:21:38.618 186844 DEBUG nova.compute.manager [req-a3dd45cd-b00d-48c8-9f95-6890f781c551 req-4c07236a-81bf-4c2b-a9e6-76a15ed1d11d 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Processing event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.028 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212899.0280735, 8f00a140-3d50-4972-b03d-28a0e12800c8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.029 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] VM Started (Lifecycle Event)
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.032 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.037 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.042 186844 INFO nova.virt.libvirt.driver [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Instance spawned successfully.
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.042 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.063 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.071 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.081 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.082 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.083 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.084 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.084 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.085 186844 DEBUG nova.virt.libvirt.driver [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.098 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.099 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212899.0283844, 8f00a140-3d50-4972-b03d-28a0e12800c8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.099 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] VM Paused (Lifecycle Event)
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.128 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.133 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212899.0359364, 8f00a140-3d50-4972-b03d-28a0e12800c8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.134 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] VM Resumed (Lifecycle Event)
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.160 186844 INFO nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Took 5.96 seconds to spawn the instance on the hypervisor.
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.161 186844 DEBUG nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.162 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.167 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.178 186844 DEBUG nova.network.neutron [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updated VIF entry in instance network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.178 186844 DEBUG nova.network.neutron [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updating instance_info_cache with network_info: [{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.278 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.418 186844 INFO nova.compute.manager [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Took 6.66 seconds to build instance.
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.438 186844 DEBUG oslo_concurrency.lockutils [req-d50d3eac-34b5-4087-aaeb-dc22a8f86fba req-5ba13686-7427-4f5a-b468-086d3bdf3eed 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:39 compute-0 nova_compute[186840]: 2026-02-27 17:21:39.479 186844 DEBUG oslo_concurrency.lockutils [None req-e8978b6d-3455-4027-b061-56be3ce64062 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.738 186844 DEBUG nova.compute.manager [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.738 186844 DEBUG oslo_concurrency.lockutils [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.738 186844 DEBUG oslo_concurrency.lockutils [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.739 186844 DEBUG oslo_concurrency.lockutils [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.739 186844 DEBUG nova.compute.manager [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] No waiting events found dispatching network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:21:40 compute-0 nova_compute[186840]: 2026-02-27 17:21:40.739 186844 WARNING nova.compute.manager [req-bd759b3a-1270-451c-9c92-2ab17e490dc8 req-76b928b7-338c-4bde-8f92-6601809a7d5b 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received unexpected event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 for instance with vm_state active and task_state None.
Feb 27 17:21:41 compute-0 nova_compute[186840]: 2026-02-27 17:21:41.012 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:42 compute-0 nova_compute[186840]: 2026-02-27 17:21:42.321 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:44.128 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.129 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:44 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:44.129 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.684 186844 DEBUG nova.compute.manager [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.684 186844 DEBUG nova.compute.manager [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing instance network info cache due to event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.685 186844 DEBUG oslo_concurrency.lockutils [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.685 186844 DEBUG oslo_concurrency.lockutils [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:21:44 compute-0 nova_compute[186840]: 2026-02-27 17:21:44.685 186844 DEBUG nova.network.neutron [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:21:44 compute-0 podman[220743]: 2026-02-27 17:21:44.69847637 +0000 UTC m=+0.093523764 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 27 17:21:44 compute-0 podman[220742]: 2026-02-27 17:21:44.713458881 +0000 UTC m=+0.109710695 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:21:45 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:45.132 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:21:46 compute-0 nova_compute[186840]: 2026-02-27 17:21:46.013 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:46 compute-0 nova_compute[186840]: 2026-02-27 17:21:46.223 186844 DEBUG nova.network.neutron [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updated VIF entry in instance network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:21:46 compute-0 nova_compute[186840]: 2026-02-27 17:21:46.223 186844 DEBUG nova.network.neutron [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updating instance_info_cache with network_info: [{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:21:46 compute-0 nova_compute[186840]: 2026-02-27 17:21:46.256 186844 DEBUG oslo_concurrency.lockutils [req-db8868a5-b899-4ec6-845b-1f07846864c3 req-dc2e24ee-6ea6-4bd3-aab4-a8b4e5cc91e7 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:21:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:47.095 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:21:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:47.096 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:21:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:21:47.096 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:21:47 compute-0 nova_compute[186840]: 2026-02-27 17:21:47.358 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:49 compute-0 podman[220804]: 2026-02-27 17:21:49.657332744 +0000 UTC m=+0.058512908 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git)
Feb 27 17:21:49 compute-0 podman[220805]: 2026-02-27 17:21:49.907093063 +0000 UTC m=+0.301693144 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, 
io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:21:50 compute-0 ovn_controller[96756]: 2026-02-27T17:21:50Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:32:df:80 10.100.0.13
Feb 27 17:21:50 compute-0 ovn_controller[96756]: 2026-02-27T17:21:50Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:32:df:80 10.100.0.13
Feb 27 17:21:51 compute-0 nova_compute[186840]: 2026-02-27 17:21:51.016 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:52 compute-0 nova_compute[186840]: 2026-02-27 17:21:52.361 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:56 compute-0 nova_compute[186840]: 2026-02-27 17:21:56.018 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:57 compute-0 nova_compute[186840]: 2026-02-27 17:21:57.364 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:21:58 compute-0 podman[220852]: 2026-02-27 17:21:58.66901345 +0000 UTC m=+0.068598638 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:22:00 compute-0 nova_compute[186840]: 2026-02-27 17:22:00.350 186844 INFO nova.compute.manager [None req-83e789b9-8f90-44cc-bd69-d02633dd636f 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Get console output
Feb 27 17:22:00 compute-0 nova_compute[186840]: 2026-02-27 17:22:00.359 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.020 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.379 186844 DEBUG nova.compute.manager [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.380 186844 DEBUG nova.compute.manager [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing instance network info cache due to event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.380 186844 DEBUG oslo_concurrency.lockutils [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.381 186844 DEBUG oslo_concurrency.lockutils [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:01 compute-0 nova_compute[186840]: 2026-02-27 17:22:01.381 186844 DEBUG nova.network.neutron [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.366 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.391 186844 INFO nova.compute.manager [None req-dff44696-3e82-4117-9f62-d343e5bdeb3f 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Get console output
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.398 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.743 186844 DEBUG nova.network.neutron [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updated VIF entry in instance network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.743 186844 DEBUG nova.network.neutron [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:02 compute-0 nova_compute[186840]: 2026-02-27 17:22:02.770 186844 DEBUG oslo_concurrency.lockutils [req-4b41986f-c94a-45c3-8996-f4fd110da53a req-88c71590-1b45-4902-8b6a-8186abe783cc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.494 186844 DEBUG nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.494 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.495 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.495 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.496 186844 DEBUG nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.496 186844 WARNING nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state active and task_state None.
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.496 186844 DEBUG nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.497 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.497 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.498 186844 DEBUG oslo_concurrency.lockutils [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.498 186844 DEBUG nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:03 compute-0 nova_compute[186840]: 2026-02-27 17:22:03.498 186844 WARNING nova.compute.manager [req-96a4bacb-b2d8-44b6-b722-55d0b40bdac6 req-eb670a4e-6bd2-4dc4-ba23-554c08e13832 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state active and task_state None.
Feb 27 17:22:03 compute-0 podman[220872]: 2026-02-27 17:22:03.668174912 +0000 UTC m=+0.069545292 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:22:04 compute-0 nova_compute[186840]: 2026-02-27 17:22:04.647 186844 INFO nova.compute.manager [None req-512d2d82-f8eb-4966-8460-58bda7cd757c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Get console output
Feb 27 17:22:04 compute-0 nova_compute[186840]: 2026-02-27 17:22:04.653 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.271 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'name': 'tempest-TestNetworkBasicOps-server-1090587586', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000c', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0922444e0aaf445884a7c2fa20793b1f', 'user_id': '427d6e526715473ebe8997007bbff5cd', 'hostId': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.274 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'name': 'tempest-TestNetworkBasicOps-server-1377424799', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000b', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '0922444e0aaf445884a7c2fa20793b1f', 'user_id': '427d6e526715473ebe8997007bbff5cd', 'hostId': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.288 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.289 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.303 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.303 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae7bdae2-f479-4a3a-af61-b32aeae6a742', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.274499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6497bb2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': 'a3684549c5a08d1a95368e1be44c96b150510b104d6f29b1b2164ac5ad48edcf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.274499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6498ada-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': '16f96a450c1e82d3212c1b3bb190f8567f17c426c0503e6e261cffc8a7899812'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.274499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd64ba8d8-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': 'd7b3571794c057a454fa51c5a727ce49daad260c96a5669d13b1e871b9e73164'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.274499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd64bb4e0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': 'ccb361a11e7c602ef26ff1b40f316e0f8a184a336468c6f3ba82ef6c50245d2c'}]}, 'timestamp': '2026-02-27 17:22:05.303897', '_unique_id': '205568ec8eee4515aebb8e6b55570339'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.305 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.309 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8f00a140-3d50-4972-b03d-28a0e12800c8 / tapebac04fa-73 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.309 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.311 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 76796d65-1ae6-49b8-a27c-99e9f3e98d8f / tap3e762eca-ff inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.311 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb162141-073e-428e-9a97-21c93c705d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.306214', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd64c99b4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': '710b1dc16185f8bf2d9fbfc7f6fafa5267387544d98c199883719f308227a34f'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.306214', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd64cfaf8-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': '8e6c14616ecb8178a745e21e145e21bea558532d62a860c46541712a6fcb77d8'}]}, 'timestamp': '2026-02-27 17:22:05.312265', '_unique_id': '765695475a7b4a6ca7d5dbe0db35211c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.313 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.314 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.314 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.314 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1731239d-a4a8-43e8-85cb-03db049bf3c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.314378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd64d5b88-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': '52b7fdbfad86212927b1a479505df9cb9b8dbe11666c815deaaeddf1c804b88b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.314378', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd64d67fe-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': '9a57acdaad7d41e688e5c0508ea2148c92e8bd4cfbc379fbc86ac45f3b73d141'}]}, 'timestamp': '2026-02-27 17:22:05.315033', '_unique_id': '2b38e987ee744201a4a5bf1d11c7ccf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.315 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.349 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.bytes volume: 31144448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.350 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.387 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.bytes volume: 31025664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.387 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a120bca2-7f29-4956-86ee-25f97f1de5e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31144448, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.316689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd652bccc-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'a6257881ef539d0f5cb3fbd5a7ae2aa6fff81864b97f3517de002d21b1c6b2d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.316689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd652d900-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '8ae2b195046ed9357080095a95b14c943c502b1663e98d475f4f6b1a8c7395c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31025664, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.316689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6587dce-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '986f24ee21b2c9c490f318ebcfa8b90d473bcac566851fa3931e170d20bd733e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.316689', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6589372-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '3c41af54d15518dfdefb9d1c119775f1a74356a1b6e42381acd1e4000c909972'}]}, 'timestamp': '2026-02-27 17:22:05.388407', '_unique_id': '3b9362a54b984710abb0806b460bf1dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.389 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.391 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.391 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.391 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.391 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.391 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.392 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '236971da-834c-4445-bdfb-23de2ed2125f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.391799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd6592e2c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': 'df921b8d82bb5f9b4937e11195f510b163a6973648b0aea36319f8761bf97b0f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.391799', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd65940f6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': 'c53492b82edfe41faaea0f9746ba3ad2e0247acbb5703dfaa80dda7f015fadb4'}]}, 'timestamp': '2026-02-27 17:22:05.392734', '_unique_id': 'a686d28efcc848178469e706c2bd75ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.393 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.394 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.394 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.395 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.395 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.395 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.latency volume: 700944470 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.395 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.latency volume: 62187839 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.396 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.latency volume: 609085837 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.396 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.latency volume: 47149221 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24bbd39c-393f-4e96-b861-75a36c3ee194', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 700944470, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.395521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd659bf40-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'cb728069d5e3b18db846683c733487b8b5e3edfe11b1928a4c88b6690eaa472d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62187839, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.395521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd659cf44-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'c1fbdffa0c4d48acb9fad602d6a6d8b1ecf962c0c00a6f5c5de88168c5b0de54'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 609085837, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.395521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd659dfca-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '1964e2cf6906c98b861c11b9169429bbe97ea408cec35e50c955d592c9f4939e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 47149221, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.395521', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd659ee84-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '1d3dd96c5a9436ee9db03d5ecbef9bc7418c1d837caec03c6fed132e37282298'}]}, 'timestamp': '2026-02-27 17:22:05.397151', '_unique_id': 'a2af17331d184dc1bf7f582051aa0c46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.398 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.399 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.399 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38ffe9e0-30ce-4303-978b-454dbc5e2700', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.399471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd65a5982-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': 'b93e31a209bb27844f60a5bc9ef2cdf2c8b3ede83fa4effb520c404ce613cff8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.399471', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd65a6a30-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': '31a24f3a70ca94212971611e2fa66e6e9f09d40e220edb0bfc3f5eff42bfab70'}]}, 'timestamp': '2026-02-27 17:22:05.400373', '_unique_id': 'a191c13e1ae246de95d3315491926886'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.401 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.404 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.404 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.404 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.405 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.405 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '416e3b4d-f27e-47fb-8174-f317a9bdbcea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.404297', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd65b16b0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': 'ae91eff88411a83d4743cbba4956f7210f1f07f180ffabf892ebc32fd8cc587a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.404297', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd65b2a6a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': '60fcaa9a1c2c6b3052122db683def5fd92cba25deb20ea1e8532bb7f92ff8fcd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.404297', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd65b3dac-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': 'd176dc9696dd54006678b93c448956f541650a839d249e0f38b0c4594130e770'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.404297', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd65b5102-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': '1dd6299c21ed4dbc9432e215c83a6683a187143b67905115f80f8a61a50b06b5'}]}, 'timestamp': '2026-02-27 17:22:05.406299', '_unique_id': 'b27b5bc48b6346759abd81c2d06ecc1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.407 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.408 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.incoming.bytes volume: 13736 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.408 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.incoming.bytes volume: 7006 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6083fdd-e6c6-44c1-8f3d-dcab610715c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 13736, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.408541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd65bbc14-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': '5a2ac2ba5faebaeac90e6dd045310fb994e2c2fe011aeb1c8fbdb39b8b0adb90'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7006, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.408541', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd65bcaf6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': 'ff4e51b72e2c23c411eef00a39b6b4905215863c224ae068c0451ae7827b32de'}]}, 'timestamp': '2026-02-27 17:22:05.409339', '_unique_id': 'aa1529bbf10c4557a9220e3ba0e5e122'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.409 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.411 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.outgoing.bytes volume: 12296 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.411 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.outgoing.bytes volume: 5552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '112a61a4-7fea-43d4-afdc-fbeff6f53432', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 12296, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.411071', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd65c2028-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': 'e4385be675cee819a5ff1f3e66c74db9a029a67810763d7ee410f7210cb3d155'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5552, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.411071', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd65c318a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': 'ccf0fee3b8358957d5fea8dd32e72a21d32acf64796e6cfad12ee55ee91d7ef8'}]}, 'timestamp': '2026-02-27 17:22:05.411937', '_unique_id': '287ee3bae5c245a3b17ff6476568e620'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.412 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.413 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.431 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/cpu volume: 9930000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 rsyslogd[1012]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.445 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/cpu volume: 10270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2734a62-e161-4dfc-9cab-823021ac24c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9930000000, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'timestamp': '2026-02-27T17:22:05.413469', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd65f4366-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.976349933, 'message_signature': 'a237862661c8622df1fd5d1a1fee7e0f90dc4648391d92fee10bd5a7663b7d60'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10270000000, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'timestamp': '2026-02-27T17:22:05.413469', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'd6616d08-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.990565465, 'message_signature': 'e36166c7f73e7a58ba7b8321992e841ee3ead89600f24ec84ee6a7f140793d8e'}]}, 'timestamp': '2026-02-27 17:22:05.446274', '_unique_id': '4870ebb6c92344deb1413baa7143d574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.447 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.448 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.incoming.packets volume: 79 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.448 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.incoming.packets volume: 43 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c731add-7f2c-486f-bee8-696f15c6ead7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 79, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.448238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd661cbd6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': '1fd88f4165b50962676209ce4ec81ab59331ea12543745ac75f6c3a6b8e94e0a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 43, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.448238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd661da22-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': 'e34baa98dc38e5efa070a97e33385c6d42e2c7e5ece7c0febd9fdd58d95f8897'}]}, 'timestamp': '2026-02-27 17:22:05.449023', '_unique_id': '387aad71a2cd4e61b691ffc094190f0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.449 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.450 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.451 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.451 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.allocation volume: 30679040 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.451 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0071fc34-0276-4986-aeba-84e3814e83eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.450635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd66227fc-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': '482ac428e4310d697eb72e4b8bcf5f6b47cf6efb404a5520fbe7251ffc50b29e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.450635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd662353a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.819314719, 'message_signature': 'e78a98f0058c3dbcee3a74532b36f9968f76515b73477339d1b55becffe0b910'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30679040, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.450635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd66242fa-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': 'b4b93d4eaac9a823918216a1a5c45e41c11ff533acd7d00e03c16b1286919c46'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.450635', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6624e8a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.834485704, 'message_signature': 'b333dbcc98a61903b1370d054e69142e066221abf0d7950de6ca29112d0f3e07'}]}, 'timestamp': '2026-02-27 17:22:05.452027', '_unique_id': 'ae8013a16fc146abacae541bbf89c998'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.452 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.453 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.453 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.latency volume: 2959027017 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.454 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.454 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.latency volume: 3385601429 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.454 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c358477-1443-4469-b048-821ad682ebc7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2959027017, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.453649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6629d68-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'af054f4bff642264fbec4c5c46e379af6671e996cf324517f26be6daecccb00b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.453649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd662aac4-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'e4b9d7df84ec54639e14400a933162f4cc0201092bc75ae388b3430f79f538b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3385601429, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.453649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd662b7d0-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '8c8ae77791cb479c29246a2d235f7062efbad08c1f71a7c55ad6ac03a609960b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.453649', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd662c324-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': 'c5223854b6280b264bbf1d1df4b5e031fe1914e9d0a3a0c1cd008364aee0759c'}]}, 'timestamp': '2026-02-27 17:22:05.454967', '_unique_id': '8da35a8ecebe4de99bba2b13eac9af13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.455 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.456 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.456 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.outgoing.packets volume: 86 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.outgoing.packets volume: 44 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1920a98e-5e3b-4c4b-95d0-324aef6e4d2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 86, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.456621', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd663119e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': 'ef6b793fd71667dd6dbb3cdb071e22b5c5723ced2f25484481508f01c640f684'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 44, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.456621', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd6631e78-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': '99c5690bdf5cc9a248427e80913a995334121ca08b933298df8b241d69fc0e45'}]}, 'timestamp': '2026-02-27 17:22:05.457365', '_unique_id': '8543fe1b75914893a056b8d0329b05fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.457 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.458 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.bytes volume: 72949760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.459 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.459 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.bytes volume: 72978432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.459 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb30e5d6-ed69-49aa-a82b-28cb47816c1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72949760, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.458886', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd66369d2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '675d303a695ffd7f61ca0a1bf25774b35b4dae7782ac2afe18c0ef84ed43f4e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 
'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.458886', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd66379c2-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '9306109222d91f0c160c408009aa64c01960df1187807518ea678e64e57318a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72978432, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.458886', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6638458-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': 'f92d1ce5d789eaea1b04424ea368e06ca49567b1a0fd0dd7745fdb6db7acc234'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.458886', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6638e12-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': 'a007156b79c9b627f045ceb9b5624aa046fae474db5a3ea15a531bba59d2c07a'}]}, 'timestamp': '2026-02-27 17:22:05.460153', '_unique_id': '5703a1f3223c470b8d2c5e8f3f476cbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.460 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.461 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.461 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.461 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1090587586>, <NovaLikeServer: tempest-TestNetworkBasicOps-server-1377424799>]
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.461 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.462 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.462 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ef73ea0-da8b-4a2d-98a3-0c17bdc45e7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.461999', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd663e11e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': '51abf91bf2b302371355446a4703c866bcab10c099f57f8e0b8dbc7b93b5c234'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.461999', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd663ed4e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': '16989c476f971868ceee4c8c95a24c75ada712b7783e2a77a1d2d81dfa7f7f5f'}]}, 'timestamp': '2026-02-27 17:22:05.462604', '_unique_id': 'efcb8101e11e48bd802c913c1a6b4e1f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.463 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.464 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.464 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6bbafd0-8a82-407b-9a4a-ba33aa921f9d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000c-8f00a140-3d50-4972-b03d-28a0e12800c8-tapebac04fa-73', 'timestamp': '2026-02-27T17:22:05.464035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'tapebac04fa-73', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:32:df:80', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapebac04fa-73'}, 'message_id': 'd6643452-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.851048664, 'message_signature': 'abf5293dc7049ed9d00c4c1a6f80ce7f95d8fe3fb535696dfe48b55c27e4e617'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 'instance-0000000b-76796d65-1ae6-49b8-a27c-99e9f3e98d8f-tap3e762eca-ff', 'timestamp': '2026-02-27T17:22:05.464035', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'tap3e762eca-ff', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:65:bf:ef', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3e762eca-ff'}, 'message_id': 'd664442e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.85453594, 'message_signature': 'd07c30be16c5889915d4948c1e0b7b083a24c1f2431e23680da1c09900c59ba0'}]}, 'timestamp': '2026-02-27 17:22:05.464839', '_unique_id': '10fa3f58fd954d45b40f9fa4a03dbb59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.465 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.466 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.466 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.requests volume: 317 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.466 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.466 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.requests volume: 310 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.467 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a8c690ae-0f97-4d5c-af3a-f10324681ece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 317, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.466328', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd6648a10-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '9746398c15d61af8e12ed939d96b0618aa23a6485f27666a30e6e1f9885c55c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': 
None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.466328', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6649582-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '578cb3a27bbac533bb9e4d1776baad49ba2e820ac5c296ab78253f6b255382a6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 310, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.466328', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd664a07c-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '9131d70d9481469322c51e65e638bbdbcff3aeaaaa5d580e05df428993282305'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.466328', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd664ac3e-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '0e16a2c3e5bc40e7d4270174c68dbbc8e9de7f6515f28d3f50f0f698ef7612d9'}]}, 'timestamp': '2026-02-27 17:22:05.467484', '_unique_id': '89bb886c53cb41a998384a6d8bc03ed5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.468 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.469 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.requests volume: 1125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.469 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.469 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.requests volume: 1137 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.469 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd6ddf39c-0c37-4d22-9a7a-1005ae73fd02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1125, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-vda', 'timestamp': '2026-02-27T17:22:05.469015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd664f31a-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': '11bcb67a8e8dd8902b3102042dbde151985b3aa6664e8aad61089403a3d5dccd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': 
None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8-sda', 'timestamp': '2026-02-27T17:22:05.469015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd664febe-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.861497462, 'message_signature': 'b6a2f59ec07bcbed93fcbe9f09540bb39a34296f71a29816bcd57569ea9491aa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1137, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-vda', 'timestamp': '2026-02-27T17:22:05.469015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'd66508e6-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': '683b15ba43b488f4655e0af90f1984fd25bfc3d8a1bc1e855c4ac3e774553f87'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f-sda', 'timestamp': '2026-02-27T17:22:05.469015', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'd6651412-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.895612006, 'message_signature': 'e2cf86a2f5af0cde3c20b15ce0d416308bca76bddc2b7123d729c7d570bc64f0'}]}, 'timestamp': '2026-02-27 17:22:05.470143', '_unique_id': 'a8efb4dc7fd84fbe8701f2c58d6ca797'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.470 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.471 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.471 12 DEBUG ceilometer.compute.pollsters [-] 8f00a140-3d50-4972-b03d-28a0e12800c8/memory.usage volume: 42.7265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.471 12 DEBUG ceilometer.compute.pollsters [-] 76796d65-1ae6-49b8-a27c-99e9f3e98d8f/memory.usage volume: 42.83203125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cdccf4d-e4e5-416d-a015-4e06b9a26738', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.7265625, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'timestamp': '2026-02-27T17:22:05.471696', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1090587586', 'name': 'instance-0000000c', 'instance_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd6655be8-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.976349933, 'message_signature': '8742a3d71610c9af94f443a7b97b75ed05c93b7518a8360c095eb9b0105c70cc'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.83203125, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_name': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_name': None, 'resource_id': 
'76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'timestamp': '2026-02-27T17:22:05.471696', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1377424799', 'name': 'instance-0000000b', 'instance_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'instance_type': 'm1.nano', 'host': 'a607a180c99aa7b47fe111de26887b9f259c0886cbbe6930275f2b3c', 'instance_host': 'compute-0.ctlplane.example.com', 'flavor': {'id': 'a21147e3-c734-4efb-8cc1-463f16e819cd', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}, 'image_ref': 'b49463d5-90a4-4c27-9dac-a140f152eabc', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'd6656782-1400-11f1-ae47-fa163e685992', 'monotonic_time': 3800.990565465, 'message_signature': 'bce64c493c6565e29a621f055a546c296f2d6984a49869475e263e7622121973'}]}, 'timestamp': '2026-02-27 17:22:05.472304', '_unique_id': 'bbc6ecf18b334398be9c52e45b093453'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     yield
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 27 17:22:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:22:05.472 12 ERROR oslo_messaging.notify.messaging 
Feb 27 17:22:05 compute-0 nova_compute[186840]: 2026-02-27 17:22:05.616 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:05 compute-0 nova_compute[186840]: 2026-02-27 17:22:05.616 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing instance network info cache due to event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:05 compute-0 nova_compute[186840]: 2026-02-27 17:22:05.616 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:05 compute-0 nova_compute[186840]: 2026-02-27 17:22:05.616 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:05 compute-0 nova_compute[186840]: 2026-02-27 17:22:05.617 186844 DEBUG nova.network.neutron [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:22:06 compute-0 nova_compute[186840]: 2026-02-27 17:22:06.023 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:07 compute-0 nova_compute[186840]: 2026-02-27 17:22:07.399 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.109 186844 DEBUG nova.compute.manager [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.110 186844 DEBUG nova.compute.manager [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing instance network info cache due to event network-changed-ebac04fa-73a7-4be5-b486-8b87c3270448. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.110 186844 DEBUG oslo_concurrency.lockutils [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.111 186844 DEBUG oslo_concurrency.lockutils [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.111 186844 DEBUG nova.network.neutron [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Refreshing network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.203 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.204 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.204 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.205 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.205 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.207 186844 INFO nova.compute.manager [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Terminating instance
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.209 186844 DEBUG nova.compute.manager [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:22:08 compute-0 kernel: tapebac04fa-73 (unregistering): left promiscuous mode
Feb 27 17:22:08 compute-0 NetworkManager[56537]: <info>  [1772212928.2420] device (tapebac04fa-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:22:08 compute-0 ovn_controller[96756]: 2026-02-27T17:22:08Z|00155|binding|INFO|Releasing lport ebac04fa-73a7-4be5-b486-8b87c3270448 from this chassis (sb_readonly=0)
Feb 27 17:22:08 compute-0 ovn_controller[96756]: 2026-02-27T17:22:08Z|00156|binding|INFO|Setting lport ebac04fa-73a7-4be5-b486-8b87c3270448 down in Southbound
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.253 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 ovn_controller[96756]: 2026-02-27T17:22:08Z|00157|binding|INFO|Removing iface tapebac04fa-73 ovn-installed in OVS
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.258 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.264 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:32:df:80 10.100.0.13'], port_security=['fa:16:3e:32:df:80 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8f00a140-3d50-4972-b03d-28a0e12800c8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3452a48-3d12-4bdb-ba02-d5fde1af6478', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5da8242-881c-4f4b-b09e-baa93811cec7, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=ebac04fa-73a7-4be5-b486-8b87c3270448) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.266 106085 INFO neutron.agent.ovn.metadata.agent [-] Port ebac04fa-73a7-4be5-b486-8b87c3270448 in datapath f9adecf3-bded-4f54-b5e0-7e0c3564bf2a unbound from our chassis
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.266 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.268 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f9adecf3-bded-4f54-b5e0-7e0c3564bf2a
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.283 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[caaa65eb-da42-43ad-ad96-96f837230c05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Feb 27 17:22:08 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000000c.scope: Consumed 12.012s CPU time.
Feb 27 17:22:08 compute-0 systemd-machined[156136]: Machine qemu-12-instance-0000000c terminated.
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.317 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[d84a089c-d44e-43f0-ab60-a7e6eaccd23e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.321 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[ce60daf3-26a8-4958-a50a-9d21ee1e115e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.354 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[7df1c343-7a56-4c16-b2b1-55a36082efc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.370 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[3718631e-6439-4caa-8a05-47f6dc2f8660]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf9adecf3-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:6e:f0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 8, 'rx_bytes': 700, 'tx_bytes': 524, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 44], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375578, 'reachable_time': 23085, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220911, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.384 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[4fab08e2-187a-4132-b6bf-6201031302e1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapf9adecf3-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375586, 'tstamp': 375586}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220912, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapf9adecf3-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 375589, 'tstamp': 375589}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220912, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.386 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9adecf3-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.388 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.392 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.393 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf9adecf3-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.393 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.394 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf9adecf3-b0, col_values=(('external_ids', {'iface-id': '528129d5-a74d-4559-b145-2d5af576200c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:08 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:08.395 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.464 186844 INFO nova.virt.libvirt.driver [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Instance destroyed successfully.
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.464 186844 DEBUG nova.objects.instance [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 8f00a140-3d50-4972-b03d-28a0e12800c8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.477 186844 DEBUG nova.virt.libvirt.vif [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:21:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1090587586',display_name='tempest-TestNetworkBasicOps-server-1090587586',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1090587586',id=12,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBENlNHBW2+Q5G7+YcIic0pFABdrSs8/tx6oq/c4459Blm4jV0iJ8gdIDrvawoG2LeCANWjKTh1pRvxpwMa8STXngLk0QWms6S10pber5A7UFowOroastyEwgDpMm42OA0Q==',key_name='tempest-TestNetworkBasicOps-1020147262',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:21:39Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-ms4byc60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:21:39Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8f00a140-3d50-4972-b03d-28a0e12800c8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.477 186844 DEBUG nova.network.os_vif_util [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.478 186844 DEBUG nova.network.os_vif_util [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.479 186844 DEBUG os_vif [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.481 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.482 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapebac04fa-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.484 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.486 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.489 186844 INFO os_vif [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:32:df:80,bridge_name='br-int',has_traffic_filtering=True,id=ebac04fa-73a7-4be5-b486-8b87c3270448,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapebac04fa-73')
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.490 186844 INFO nova.virt.libvirt.driver [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Deleting instance files /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8_del
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.491 186844 INFO nova.virt.libvirt.driver [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Deletion of /var/lib/nova/instances/8f00a140-3d50-4972-b03d-28a0e12800c8_del complete
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.578 186844 INFO nova.compute.manager [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Took 0.37 seconds to destroy the instance on the hypervisor.
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.579 186844 DEBUG oslo.service.loopingcall [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.580 186844 DEBUG nova.compute.manager [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:22:08 compute-0 nova_compute[186840]: 2026-02-27 17:22:08.580 186844 DEBUG nova.network.neutron [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.021 186844 DEBUG nova.network.neutron [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.064 186844 INFO nova.compute.manager [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Took 1.48 seconds to deallocate network for instance.
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.114 186844 DEBUG nova.network.neutron [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updated VIF entry in instance network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.115 186844 DEBUG nova.network.neutron [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.161 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.162 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.178 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.179 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.179 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.179 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.180 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.180 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.180 186844 WARNING nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state active and task_state None.
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.181 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.181 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.182 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.182 186844 DEBUG oslo_concurrency.lockutils [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.183 186844 DEBUG nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.183 186844 WARNING nova.compute.manager [req-639d3d01-d34c-4f07-aa57-8bdb5fd7cdba req-039070b4-cea4-4819-b916-29d95e793488 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state active and task_state None.
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.189 186844 DEBUG nova.compute.manager [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.189 186844 DEBUG oslo_concurrency.lockutils [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.190 186844 DEBUG oslo_concurrency.lockutils [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.190 186844 DEBUG oslo_concurrency.lockutils [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.190 186844 DEBUG nova.compute.manager [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] No waiting events found dispatching network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.191 186844 WARNING nova.compute.manager [req-4523cf82-652b-42b6-b4e8-3baddf177baa req-d1d0e07c-a046-4803-bbb9-621f1d0f5cdc 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received unexpected event network-vif-plugged-ebac04fa-73a7-4be5-b486-8b87c3270448 for instance with vm_state deleted and task_state None.
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.277 186844 DEBUG nova.compute.provider_tree [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.312 186844 DEBUG nova.compute.manager [req-c9e64504-91b9-4004-92d1-a49716fcf12c req-9a483c40-5533-4759-acea-2e13fa4018bd 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Received event network-vif-deleted-ebac04fa-73a7-4be5-b486-8b87c3270448 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.325 186844 DEBUG nova.scheduler.client.report [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.399 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.432 186844 INFO nova.scheduler.client.report [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 8f00a140-3d50-4972-b03d-28a0e12800c8
Feb 27 17:22:10 compute-0 nova_compute[186840]: 2026-02-27 17:22:10.539 186844 DEBUG oslo_concurrency.lockutils [None req-c11f1174-5575-4506-8ab0-8b6099d5021a 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8f00a140-3d50-4972-b03d-28a0e12800c8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.335s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:11 compute-0 nova_compute[186840]: 2026-02-27 17:22:11.074 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:11 compute-0 nova_compute[186840]: 2026-02-27 17:22:11.167 186844 DEBUG nova.network.neutron [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updated VIF entry in instance network info cache for port ebac04fa-73a7-4be5-b486-8b87c3270448. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:22:11 compute-0 nova_compute[186840]: 2026-02-27 17:22:11.168 186844 DEBUG nova.network.neutron [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Updating instance_info_cache with network_info: [{"id": "ebac04fa-73a7-4be5-b486-8b87c3270448", "address": "fa:16:3e:32:df:80", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapebac04fa-73", "ovs_interfaceid": "ebac04fa-73a7-4be5-b486-8b87c3270448", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:11 compute-0 nova_compute[186840]: 2026-02-27 17:22:11.199 186844 DEBUG oslo_concurrency.lockutils [req-8303348f-4199-4cd6-b901-8180ba8535cf req-3b9aadfc-8904-4dad-9744-d329e4bdab4e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8f00a140-3d50-4972-b03d-28a0e12800c8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.316 186844 DEBUG nova.compute.manager [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.316 186844 DEBUG nova.compute.manager [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing instance network info cache due to event network-changed-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.317 186844 DEBUG oslo_concurrency.lockutils [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.317 186844 DEBUG oslo_concurrency.lockutils [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.317 186844 DEBUG nova.network.neutron [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Refreshing network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.398 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.399 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.400 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.400 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.400 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.402 186844 INFO nova.compute.manager [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Terminating instance
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.404 186844 DEBUG nova.compute.manager [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:22:13 compute-0 kernel: tap3e762eca-ff (unregistering): left promiscuous mode
Feb 27 17:22:13 compute-0 NetworkManager[56537]: <info>  [1772212933.4314] device (tap3e762eca-ff): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.431 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 ovn_controller[96756]: 2026-02-27T17:22:13Z|00158|binding|INFO|Releasing lport 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb from this chassis (sb_readonly=0)
Feb 27 17:22:13 compute-0 ovn_controller[96756]: 2026-02-27T17:22:13Z|00159|binding|INFO|Setting lport 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb down in Southbound
Feb 27 17:22:13 compute-0 ovn_controller[96756]: 2026-02-27T17:22:13Z|00160|binding|INFO|Removing iface tap3e762eca-ff ovn-installed in OVS
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.439 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.447 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:bf:ef 10.100.0.3'], port_security=['fa:16:3e:65:bf:ef 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '76796d65-1ae6-49b8-a27c-99e9f3e98d8f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'fd5bbb1b-a90c-4d23-afdc-bfe41a6e60ad', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5da8242-881c-4f4b-b09e-baa93811cec7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.449 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb in datapath f9adecf3-bded-4f54-b5e0-7e0c3564bf2a unbound from our chassis
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.450 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.451 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.451 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[8b180947-5ce3-4843-bd97-2f017affbae5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.452 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a namespace which is not needed anymore
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.484 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Feb 27 17:22:13 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000b.scope: Consumed 13.015s CPU time.
Feb 27 17:22:13 compute-0 systemd-machined[156136]: Machine qemu-11-instance-0000000b terminated.
Feb 27 17:22:13 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [NOTICE]   (220613) : haproxy version is 2.8.14-c23fe91
Feb 27 17:22:13 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [NOTICE]   (220613) : path to executable is /usr/sbin/haproxy
Feb 27 17:22:13 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [WARNING]  (220613) : Exiting Master process...
Feb 27 17:22:13 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [ALERT]    (220613) : Current worker (220615) exited with code 143 (Terminated)
Feb 27 17:22:13 compute-0 neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a[220609]: [WARNING]  (220613) : All workers exited. Exiting... (0)
Feb 27 17:22:13 compute-0 systemd[1]: libpod-457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4.scope: Deactivated successfully.
Feb 27 17:22:13 compute-0 podman[220955]: 2026-02-27 17:22:13.624677509 +0000 UTC m=+0.049116226 container died 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223)
Feb 27 17:22:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4-userdata-shm.mount: Deactivated successfully.
Feb 27 17:22:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-15c4b5caa68f6c3a2727ea18a10b879d21b7a96036422b43dcdb4b724348c2a8-merged.mount: Deactivated successfully.
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.665 186844 INFO nova.virt.libvirt.driver [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance destroyed successfully.
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.666 186844 DEBUG nova.objects.instance [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 76796d65-1ae6-49b8-a27c-99e9f3e98d8f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:22:13 compute-0 podman[220955]: 2026-02-27 17:22:13.672559154 +0000 UTC m=+0.096997831 container cleanup 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.672 186844 DEBUG nova.compute.manager [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.673 186844 DEBUG oslo_concurrency.lockutils [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.673 186844 DEBUG oslo_concurrency.lockutils [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.673 186844 DEBUG oslo_concurrency.lockutils [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.674 186844 DEBUG nova.compute.manager [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.674 186844 DEBUG nova.compute.manager [req-941f2f0b-43a2-490d-9709-5b410829d4e9 req-b88d71d1-6922-4a28-9841-8e0f8bf99b46 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-unplugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 27 17:22:13 compute-0 systemd[1]: libpod-conmon-457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4.scope: Deactivated successfully.
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.691 186844 DEBUG nova.virt.libvirt.vif [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:21:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1377424799',display_name='tempest-TestNetworkBasicOps-server-1377424799',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1377424799',id=11,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB5yMj7rv2FoFn43N91mInOFoiaVjdK+YUgcyY4/JaxhlF+sTH28XfB7CGBYhg4B37Cdd0aIrCrJzciXqQQtOxDwn9R4t3S5eghkg4mO9lFmSAyX4zi+cmINV24rq34ZTA==',key_name='tempest-TestNetworkBasicOps-287151659',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:21:20Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-2d6jv0aw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:21:20Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=76796d65-1ae6-49b8-a27c-99e9f3e98d8f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.692 186844 DEBUG nova.network.os_vif_util [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.693 186844 DEBUG nova.network.os_vif_util [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.694 186844 DEBUG os_vif [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.696 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.696 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3e762eca-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.700 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.702 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.703 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.706 186844 INFO os_vif [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:65:bf:ef,bridge_name='br-int',has_traffic_filtering=True,id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb,network=Network(f9adecf3-bded-4f54-b5e0-7e0c3564bf2a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3e762eca-ff')
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.707 186844 INFO nova.virt.libvirt.driver [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Deleting instance files /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f_del
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.708 186844 INFO nova.virt.libvirt.driver [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Deletion of /var/lib/nova/instances/76796d65-1ae6-49b8-a27c-99e9f3e98d8f_del complete
Feb 27 17:22:13 compute-0 podman[221004]: 2026-02-27 17:22:13.744665477 +0000 UTC m=+0.048573802 container remove 457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.750 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[449ac1cb-699d-4ad3-8543-9fa84bce12f1]: (4, ('Fri Feb 27 05:22:13 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a (457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4)\n457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4\nFri Feb 27 05:22:13 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a (457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4)\n457d844b43c0868cd93308948927b35e31b288d9f4214636e7d6abdd286354b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.752 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a891e504-a68c-4377-bf04-567a5c88bbe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.753 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf9adecf3-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.754 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.756 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.756 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:13 compute-0 kernel: tapf9adecf3-b0: left promiscuous mode
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.757 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.758 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.761 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6bb28b-131c-4824-b691-966a63c8637e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.763 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.774 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0a100b-11e4-42df-b24d-f16f5e3a749c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.776 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[95ce5f75-9a10-4709-8e57-8c8ae9a18dd8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.789 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a99f0d54-f2b3-403a-b6ad-bba32eb08dee]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 375572, 'reachable_time': 41627, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221021, 'error': None, 'target': 'ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 systemd[1]: run-netns-ovnmeta\x2df9adecf3\x2dbded\x2d4f54\x2db5e0\x2d7e0c3564bf2a.mount: Deactivated successfully.
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.792 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f9adecf3-bded-4f54-b5e0-7e0c3564bf2a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:22:13 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:13.792 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[0bde7efb-ced7-4353-8a6e-61687949f740]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.813 186844 INFO nova.compute.manager [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Took 0.41 seconds to destroy the instance on the hypervisor.
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.815 186844 DEBUG oslo.service.loopingcall [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.816 186844 DEBUG nova.compute.manager [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.816 186844 DEBUG nova.network.neutron [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:22:13 compute-0 nova_compute[186840]: 2026-02-27 17:22:13.832 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Error from libvirt while getting description of instance-0000000b: [Error Code 42] Domain not found: no domain with matching uuid '76796d65-1ae6-49b8-a27c-99e9f3e98d8f' (instance-0000000b): libvirt.libvirtError: Domain not found: no domain with matching uuid '76796d65-1ae6-49b8-a27c-99e9f3e98d8f' (instance-0000000b)
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.024 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.026 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5759MB free_disk=73.19433212280273GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.027 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.028 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.239 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 76796d65-1ae6-49b8-a27c-99e9f3e98d8f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.240 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.240 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.461 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.489 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.605 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.606 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.631 186844 DEBUG nova.network.neutron [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.660 186844 INFO nova.compute.manager [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Took 0.84 seconds to deallocate network for instance.
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.722 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.723 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.764 186844 DEBUG nova.network.neutron [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updated VIF entry in instance network info cache for port 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.766 186844 DEBUG nova.network.neutron [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Updating instance_info_cache with network_info: [{"id": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "address": "fa:16:3e:65:bf:ef", "network": {"id": "f9adecf3-bded-4f54-b5e0-7e0c3564bf2a", "bridge": "br-int", "label": "tempest-network-smoke--1713592475", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3e762eca-ff", "ovs_interfaceid": "3e762eca-ffd0-4082-a5a7-bd3e3880d6fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.773 186844 DEBUG nova.compute.provider_tree [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.805 186844 DEBUG nova.scheduler.client.report [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.814 186844 DEBUG oslo_concurrency.lockutils [req-c0693e09-5983-4cf6-b2bd-98d6322da409 req-f14bcb83-dffa-4d89-9d9f-65807f039141 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-76796d65-1ae6-49b8-a27c-99e9f3e98d8f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.840 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.883 186844 INFO nova.scheduler.client.report [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 76796d65-1ae6-49b8-a27c-99e9f3e98d8f
Feb 27 17:22:14 compute-0 nova_compute[186840]: 2026-02-27 17:22:14.952 186844 DEBUG oslo_concurrency.lockutils [None req-09532911-89b7-4636-b8ee-53754a5df9cd 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.446 186844 DEBUG nova.compute.manager [req-1319abc2-93e3-49eb-a50f-f0ef0f832fe9 req-60ac209d-96b4-47a9-852b-942f99e0d2b9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-deleted-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.446 186844 INFO nova.compute.manager [req-1319abc2-93e3-49eb-a50f-f0ef0f832fe9 req-60ac209d-96b4-47a9-852b-942f99e0d2b9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Neutron deleted interface 3e762eca-ffd0-4082-a5a7-bd3e3880d6fb; detaching it from the instance and deleting it from the info cache
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.447 186844 DEBUG nova.network.neutron [req-1319abc2-93e3-49eb-a50f-f0ef0f832fe9 req-60ac209d-96b4-47a9-852b-942f99e0d2b9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.449 186844 DEBUG nova.compute.manager [req-1319abc2-93e3-49eb-a50f-f0ef0f832fe9 req-60ac209d-96b4-47a9-852b-942f99e0d2b9 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Detach interface failed, port_id=3e762eca-ffd0-4082-a5a7-bd3e3880d6fb, reason: Instance 76796d65-1ae6-49b8-a27c-99e9f3e98d8f could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Feb 27 17:22:15 compute-0 podman[221022]: 2026-02-27 17:22:15.645883917 +0000 UTC m=+0.055072753 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:22:15 compute-0 podman[221023]: 2026-02-27 17:22:15.64601902 +0000 UTC m=+0.052794576 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.736 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.736 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.736 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.755 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.800 186844 DEBUG nova.compute.manager [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.801 186844 DEBUG oslo_concurrency.lockutils [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.802 186844 DEBUG oslo_concurrency.lockutils [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.802 186844 DEBUG oslo_concurrency.lockutils [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "76796d65-1ae6-49b8-a27c-99e9f3e98d8f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.802 186844 DEBUG nova.compute.manager [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] No waiting events found dispatching network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:15 compute-0 nova_compute[186840]: 2026-02-27 17:22:15.803 186844 WARNING nova.compute.manager [req-7c229491-1a87-490e-8cf4-070adda7b00f req-79786314-285f-4307-8606-0d4dd2b69626 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Received unexpected event network-vif-plugged-3e762eca-ffd0-4082-a5a7-bd3e3880d6fb for instance with vm_state deleted and task_state None.
Feb 27 17:22:16 compute-0 nova_compute[186840]: 2026-02-27 17:22:16.109 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:16 compute-0 nova_compute[186840]: 2026-02-27 17:22:16.778 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:16 compute-0 nova_compute[186840]: 2026-02-27 17:22:16.779 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:22:16 compute-0 nova_compute[186840]: 2026-02-27 17:22:16.779 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:22:16 compute-0 nova_compute[186840]: 2026-02-27 17:22:16.801 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:22:18 compute-0 nova_compute[186840]: 2026-02-27 17:22:18.699 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:18 compute-0 nova_compute[186840]: 2026-02-27 17:22:18.926 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:18 compute-0 nova_compute[186840]: 2026-02-27 17:22:18.946 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:19 compute-0 nova_compute[186840]: 2026-02-27 17:22:19.717 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:20 compute-0 podman[221066]: 2026-02-27 17:22:20.684466711 +0000 UTC m=+0.081584709 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, release=1770267347, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=)
Feb 27 17:22:20 compute-0 nova_compute[186840]: 2026-02-27 17:22:20.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:20 compute-0 podman[221067]: 2026-02-27 17:22:20.708361012 +0000 UTC m=+0.105277545 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 27 17:22:21 compute-0 nova_compute[186840]: 2026-02-27 17:22:21.111 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:21 compute-0 nova_compute[186840]: 2026-02-27 17:22:21.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.463 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212928.461004, 8f00a140-3d50-4972-b03d-28a0e12800c8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.464 186844 INFO nova.compute.manager [-] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] VM Stopped (Lifecycle Event)
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.493 186844 DEBUG nova.compute.manager [None req-48de44ca-8b61-4bd4-a2d7-0a106317f93e - - - - - -] [instance: 8f00a140-3d50-4972-b03d-28a0e12800c8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:22:23 compute-0 nova_compute[186840]: 2026-02-27 17:22:23.700 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:26 compute-0 nova_compute[186840]: 2026-02-27 17:22:26.137 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:28 compute-0 nova_compute[186840]: 2026-02-27 17:22:28.664 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212933.6625454, 76796d65-1ae6-49b8-a27c-99e9f3e98d8f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:22:28 compute-0 nova_compute[186840]: 2026-02-27 17:22:28.665 186844 INFO nova.compute.manager [-] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] VM Stopped (Lifecycle Event)
Feb 27 17:22:28 compute-0 nova_compute[186840]: 2026-02-27 17:22:28.692 186844 DEBUG nova.compute.manager [None req-068a713f-89c7-4996-8d5d-7359ace18b5b - - - - - -] [instance: 76796d65-1ae6-49b8-a27c-99e9f3e98d8f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:28 compute-0 nova_compute[186840]: 2026-02-27 17:22:28.701 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:29 compute-0 podman[221113]: 2026-02-27 17:22:29.708508751 +0000 UTC m=+0.107374577 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.build-date=20260223)
Feb 27 17:22:31 compute-0 nova_compute[186840]: 2026-02-27 17:22:31.139 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:33 compute-0 nova_compute[186840]: 2026-02-27 17:22:33.703 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:34 compute-0 podman[221132]: 2026-02-27 17:22:34.668331098 +0000 UTC m=+0.070366521 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:22:36 compute-0 nova_compute[186840]: 2026-02-27 17:22:36.140 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:38 compute-0 nova_compute[186840]: 2026-02-27 17:22:38.706 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:41 compute-0 nova_compute[186840]: 2026-02-27 17:22:41.185 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.474 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.475 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.525 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.655 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.656 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.667 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.668 186844 INFO nova.compute.claims [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Claim successful on node compute-0.ctlplane.example.com
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.730 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:43 compute-0 nova_compute[186840]: 2026-02-27 17:22:43.936 186844 DEBUG nova.compute.provider_tree [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.004 186844 DEBUG nova.scheduler.client.report [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.157 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.501s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.158 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.439 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.440 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.490 186844 INFO nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.568 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.714 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.716 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.717 186844 INFO nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Creating image(s)
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.718 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.719 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.721 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.749 186844 DEBUG nova.policy [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '427d6e526715473ebe8997007bbff5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0922444e0aaf445884a7c2fa20793b1f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.753 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.826 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.828 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "98b35408cc9e37756bf5961408cc96a7885fb22d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.829 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.851 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.927 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.928 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.969 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d,backing_fmt=raw /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.970 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "98b35408cc9e37756bf5961408cc96a7885fb22d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:44 compute-0 nova_compute[186840]: 2026-02-27 17:22:44.971 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.046 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/98b35408cc9e37756bf5961408cc96a7885fb22d --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.048 186844 DEBUG nova.virt.disk.api [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Checking if we can resize image /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:180
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.049 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.093 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.094 186844 DEBUG nova.virt.disk.api [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Cannot resize image /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:186
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.095 186844 DEBUG nova.objects.instance [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'migration_context' on Instance uuid 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.218 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.218 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Ensure instance console log exists: /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.219 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.219 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:45 compute-0 nova_compute[186840]: 2026-02-27 17:22:45.219 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:46 compute-0 nova_compute[186840]: 2026-02-27 17:22:46.188 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:46 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:46.511 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:22:46 compute-0 nova_compute[186840]: 2026-02-27 17:22:46.512 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:46 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:46.513 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:22:46 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:46.514 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:46 compute-0 podman[221173]: 2026-02-27 17:22:46.660448107 +0000 UTC m=+0.066776793 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:22:46 compute-0 podman[221174]: 2026-02-27 17:22:46.695204457 +0000 UTC m=+0.096126959 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 27 17:22:46 compute-0 nova_compute[186840]: 2026-02-27 17:22:46.861 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Successfully created port: 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Feb 27 17:22:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:47.095 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:47.096 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:47.096 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:48 compute-0 nova_compute[186840]: 2026-02-27 17:22:48.735 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.013 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Successfully updated port: 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.144 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.144 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquired lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.145 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.265 186844 DEBUG nova.compute.manager [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.265 186844 DEBUG nova.compute.manager [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing instance network info cache due to event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:50 compute-0 nova_compute[186840]: 2026-02-27 17:22:50.266 186844 DEBUG oslo_concurrency.lockutils [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:51 compute-0 nova_compute[186840]: 2026-02-27 17:22:51.050 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 27 17:22:51 compute-0 nova_compute[186840]: 2026-02-27 17:22:51.189 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:51 compute-0 podman[221213]: 2026-02-27 17:22:51.669356827 +0000 UTC m=+0.074775921 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, managed_by=edpm_ansible, release=1770267347, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 27 17:22:51 compute-0 podman[221214]: 2026-02-27 17:22:51.694355256 +0000 UTC m=+0.092975871 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.649 186844 DEBUG nova.network.neutron [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updating instance_info_cache with network_info: [{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.694 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Releasing lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.695 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Instance network_info: |[{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.695 186844 DEBUG oslo_concurrency.lockutils [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.695 186844 DEBUG nova.network.neutron [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.698 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Start _get_guest_xml network_info=[{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'encryption_options': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'device_type': 'disk', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'guest_format': None, 'encrypted': False, 'image_id': 'b49463d5-90a4-4c27-9dac-a140f152eabc'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.703 186844 WARNING nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.709 186844 DEBUG nova.virt.libvirt.host [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.710 186844 DEBUG nova.virt.libvirt.host [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.713 186844 DEBUG nova.virt.libvirt.host [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.714 186844 DEBUG nova.virt.libvirt.host [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.715 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.715 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-27T17:11:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='a21147e3-c734-4efb-8cc1-463f16e819cd',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-27T17:11:33Z,direct_url=<?>,disk_format='qcow2',id=b49463d5-90a4-4c27-9dac-a140f152eabc,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='2d107a7a633145b79e8da4348c7f2d65',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-27T17:11:35Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.716 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.716 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.717 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.717 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.718 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.718 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.719 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.719 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.719 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.720 186844 DEBUG nova.virt.hardware [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.726 186844 DEBUG nova.virt.libvirt.vif [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1111984263',display_name='tempest-TestNetworkBasicOps-server-1111984263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1111984263',id=13,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCW+Vf+oKAa64TcEAgR3HI/FwInYpuSVHJb+s+I6+u9Y/HVVI0fBCHHm9GR9LBKuAFGhZ5mcvJZZVxW2QQy1WRlFtIxpi9P82osGwr/hCljitYSHR4iscYB/oMQOoJ9ssg==',key_name='tempest-TestNetworkBasicOps-53899880',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-k7ja1d76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:22:44Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.726 186844 DEBUG nova.network.os_vif_util [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.727 186844 DEBUG nova.network.os_vif_util [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.729 186844 DEBUG nova.objects.instance [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.802 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] End _get_guest_xml xml=<domain type="kvm">
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <uuid>8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1</uuid>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <name>instance-0000000d</name>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <memory>131072</memory>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <vcpu>1</vcpu>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <metadata>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:package version="27.5.2-0.20260220085704.5cfeecb.el9"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:name>tempest-TestNetworkBasicOps-server-1111984263</nova:name>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:creationTime>2026-02-27 17:22:52</nova:creationTime>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:flavor name="m1.nano">
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:memory>128</nova:memory>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:disk>1</nova:disk>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:swap>0</nova:swap>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:ephemeral>0</nova:ephemeral>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:vcpus>1</nova:vcpus>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       </nova:flavor>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:owner>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:user uuid="427d6e526715473ebe8997007bbff5cd">tempest-TestNetworkBasicOps-1859516505-project-member</nova:user>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:project uuid="0922444e0aaf445884a7c2fa20793b1f">tempest-TestNetworkBasicOps-1859516505</nova:project>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       </nova:owner>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:root type="image" uuid="b49463d5-90a4-4c27-9dac-a140f152eabc"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <nova:ports>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         <nova:port uuid="1044f5b2-3edd-4bb1-9c19-99ebe55c99ab">
Feb 27 17:22:52 compute-0 nova_compute[186840]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:         </nova:port>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       </nova:ports>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </nova:instance>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </metadata>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <sysinfo type="smbios">
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <system>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="manufacturer">RDO</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="product">OpenStack Compute</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="version">27.5.2-0.20260220085704.5cfeecb.el9</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="serial">8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="uuid">8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <entry name="family">Virtual Machine</entry>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </system>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </sysinfo>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <os>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <boot dev="hd"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <smbios mode="sysinfo"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </os>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <features>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <acpi/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <apic/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <vmcoreinfo/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </features>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <clock offset="utc">
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <timer name="pit" tickpolicy="delay"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <timer name="hpet" present="no"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </clock>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <cpu mode="host-model" match="exact">
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <topology sockets="1" cores="1" threads="1"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </cpu>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   <devices>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <disk type="file" device="disk">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <driver name="qemu" type="qcow2" cache="none"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <target dev="vda" bus="virtio"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <disk type="file" device="cdrom">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <driver name="qemu" type="raw" cache="none"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <source file="/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.config"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <target dev="sda" bus="sata"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </disk>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <interface type="ethernet">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <mac address="fa:16:3e:7b:f8:04"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <driver name="vhost" rx_queue_size="512"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <mtu size="1442"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <target dev="tap1044f5b2-3e"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </interface>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <serial type="pty">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <log file="/var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/console.log" append="off"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </serial>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <video>
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <model type="virtio"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </video>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <input type="tablet" bus="usb"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <rng model="virtio">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <backend model="random">/dev/urandom</backend>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </rng>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="pci" model="pcie-root-port"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <controller type="usb" index="0"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     <memballoon model="virtio">
Feb 27 17:22:52 compute-0 nova_compute[186840]:       <stats period="10"/>
Feb 27 17:22:52 compute-0 nova_compute[186840]:     </memballoon>
Feb 27 17:22:52 compute-0 nova_compute[186840]:   </devices>
Feb 27 17:22:52 compute-0 nova_compute[186840]: </domain>
Feb 27 17:22:52 compute-0 nova_compute[186840]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.803 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Preparing to wait for external event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.803 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.804 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.804 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.805 186844 DEBUG nova.virt.libvirt.vif [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-27T17:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1111984263',display_name='tempest-TestNetworkBasicOps-server-1111984263',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1111984263',id=13,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCW+Vf+oKAa64TcEAgR3HI/FwInYpuSVHJb+s+I6+u9Y/HVVI0fBCHHm9GR9LBKuAFGhZ5mcvJZZVxW2QQy1WRlFtIxpi9P82osGwr/hCljitYSHR4iscYB/oMQOoJ9ssg==',key_name='tempest-TestNetworkBasicOps-53899880',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-k7ja1d76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-27T17:22:44Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.805 186844 DEBUG nova.network.os_vif_util [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.806 186844 DEBUG nova.network.os_vif_util [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.806 186844 DEBUG os_vif [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.807 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.807 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.808 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.811 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.811 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1044f5b2-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.812 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1044f5b2-3e, col_values=(('external_ids', {'iface-id': '1044f5b2-3edd-4bb1-9c19-99ebe55c99ab', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:f8:04', 'vm-uuid': '8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.813 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:52 compute-0 NetworkManager[56537]: <info>  [1772212972.8147] manager: (tap1044f5b2-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.815 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.820 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.820 186844 INFO os_vif [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e')
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.981 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.981 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.982 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] No VIF found with MAC fa:16:3e:7b:f8:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 27 17:22:52 compute-0 nova_compute[186840]: 2026-02-27 17:22:52.982 186844 INFO nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Using config drive
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.019 186844 INFO nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Creating config drive at /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.config
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.023 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp7sz_m9xv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.147 186844 DEBUG oslo_concurrency.processutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260220085704.5cfeecb.el9 -quiet -J -r -V config-2 /tmp/tmp7sz_m9xv" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:22:54 compute-0 kernel: tap1044f5b2-3e: entered promiscuous mode
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.2100] manager: (tap1044f5b2-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Feb 27 17:22:54 compute-0 systemd-udevd[221277]: Network interface NamePolicy= disabled on kernel command line.
Feb 27 17:22:54 compute-0 ovn_controller[96756]: 2026-02-27T17:22:54Z|00161|binding|INFO|Claiming lport 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab for this chassis.
Feb 27 17:22:54 compute-0 ovn_controller[96756]: 2026-02-27T17:22:54Z|00162|binding|INFO|1044f5b2-3edd-4bb1-9c19-99ebe55c99ab: Claiming fa:16:3e:7b:f8:04 10.100.0.10
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.258 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.263 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.2754] device (tap1044f5b2-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.2775] device (tap1044f5b2-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.286 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:f8:04 10.100.0.10'], port_security=['fa:16:3e:7b:f8:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69a995b1-fab3-4631-a4c9-73f23854e64d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a4e352ed-bf10-46cf-a75f-cabcd828d88c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5c5633-2663-423a-8cd1-8fc77b585b41, chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.289 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab in datapath 69a995b1-fab3-4631-a4c9-73f23854e64d bound to our chassis
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.291 106085 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 69a995b1-fab3-4631-a4c9-73f23854e64d
Feb 27 17:22:54 compute-0 systemd-machined[156136]: New machine qemu-13-instance-0000000d.
Feb 27 17:22:54 compute-0 ovn_controller[96756]: 2026-02-27T17:22:54Z|00163|binding|INFO|Setting lport 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab ovn-installed in OVS
Feb 27 17:22:54 compute-0 ovn_controller[96756]: 2026-02-27T17:22:54Z|00164|binding|INFO|Setting lport 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab up in Southbound
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.297 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-0000000d.
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.304 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[ef56be7a-ceac-4455-bf7f-952bdc1c1432]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.306 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap69a995b1-f1 in ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.310 215632 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap69a995b1-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.310 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[01194bb6-8a5b-46c2-b2b3-95e7dfbaab31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.312 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[91c21a69-a4c8-4cf2-8916-2893082e4115]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.320 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[8b09d7fe-26c2-4b6b-96f2-bf1ec4ec6fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.331 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[2278e76b-205f-4ff5-af80-cefd80f0dc77]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.362 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[bf82a68f-c9f4-4a90-bb01-6831c7cd36ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.370 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f741bc95-92a0-4203-8359-b007a654b9bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.3725] manager: (tap69a995b1-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/88)
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.397 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[7e99bd1d-44a2-40c0-bd1d-1e967b3b1c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.400 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[aa07469a-7135-4563-833d-6d7660a61c77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.4201] device (tap69a995b1-f0): carrier: link connected
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.423 215646 DEBUG oslo.privsep.daemon [-] privsep: reply[e233f459-5aac-4664-a353-4372853acff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.439 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a32ed8-3ba6-4123-a6d5-74ca30dbee00]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69a995b1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:48:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384991, 'reachable_time': 32462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221313, 'error': None, 'target': 'ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.453 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a8ab0d86-8983-47af-af88-add42ec06665]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe68:48c2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 384991, 'tstamp': 384991}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221315, 'error': None, 'target': 'ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.469 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[06cf570e-048c-461d-ad40-cc818a96c2dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap69a995b1-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:68:48:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384991, 'reachable_time': 32462, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221316, 'error': None, 'target': 'ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.496 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[a1b68338-c46c-4e5c-98ee-2d8eb078d818]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.548 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[afbddc72-ee54-4d3c-9d21-f6aef7cf9198]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.549 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69a995b1-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.550 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.550 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69a995b1-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:54 compute-0 kernel: tap69a995b1-f0: entered promiscuous mode
Feb 27 17:22:54 compute-0 NetworkManager[56537]: <info>  [1772212974.5536] manager: (tap69a995b1-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.554 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.555 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap69a995b1-f0, col_values=(('external_ids', {'iface-id': '51084fc5-d92c-408a-a9bb-8379ff0f73a0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.557 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 ovn_controller[96756]: 2026-02-27T17:22:54Z|00165|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.558 106085 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/69a995b1-fab3-4631-a4c9-73f23854e64d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/69a995b1-fab3-4631-a4c9-73f23854e64d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.559 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8f62c0-9d40-4e43-a0e4-1ff5e9fcb1c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.560 106085 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: global
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     log         /dev/log local0 debug
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     log-tag     haproxy-metadata-proxy-69a995b1-fab3-4631-a4c9-73f23854e64d
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     user        root
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     group       root
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     maxconn     1024
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     pidfile     /var/lib/neutron/external/pids/69a995b1-fab3-4631-a4c9-73f23854e64d.pid.haproxy
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     daemon
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: defaults
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     log global
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     mode http
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     option httplog
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     option dontlognull
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     option http-server-close
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     option forwardfor
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     retries                 3
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     timeout http-request    30s
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     timeout connect         30s
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     timeout client          32s
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     timeout server          32s
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     timeout http-keep-alive 30s
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: listen listener
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     bind 169.254.169.254:80
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     server metadata /var/lib/neutron/metadata_proxy
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:     http-request add-header X-OVN-Network-ID 69a995b1-fab3-4631-a4c9-73f23854e64d
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.561 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:54 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:22:54.562 106085 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d', 'env', 'PROCESS_TAG=haproxy-69a995b1-fab3-4631-a4c9-73f23854e64d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/69a995b1-fab3-4631-a4c9-73f23854e64d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.664 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212974.6639264, 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.665 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] VM Started (Lifecycle Event)
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.773 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.778 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212974.6669197, 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.778 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] VM Paused (Lifecycle Event)
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.837 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.841 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.910 186844 DEBUG nova.network.neutron [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updated VIF entry in instance network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.911 186844 DEBUG nova.network.neutron [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updating instance_info_cache with network_info: [{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.919 186844 DEBUG nova.compute.manager [req-f72481d3-165b-41fa-a025-1d2be0582a9e req-905aee3a-0b94-4905-821d-9e57c4dc34e1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.919 186844 DEBUG oslo_concurrency.lockutils [req-f72481d3-165b-41fa-a025-1d2be0582a9e req-905aee3a-0b94-4905-821d-9e57c4dc34e1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.919 186844 DEBUG oslo_concurrency.lockutils [req-f72481d3-165b-41fa-a025-1d2be0582a9e req-905aee3a-0b94-4905-821d-9e57c4dc34e1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.919 186844 DEBUG oslo_concurrency.lockutils [req-f72481d3-165b-41fa-a025-1d2be0582a9e req-905aee3a-0b94-4905-821d-9e57c4dc34e1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.920 186844 DEBUG nova.compute.manager [req-f72481d3-165b-41fa-a025-1d2be0582a9e req-905aee3a-0b94-4905-821d-9e57c4dc34e1 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Processing event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.920 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.924 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.928 186844 INFO nova.virt.libvirt.driver [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Instance spawned successfully.
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.928 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.948 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.949 186844 DEBUG nova.virt.driver [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] Emitting event <LifecycleEvent: 1772212974.9239206, 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.949 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] VM Resumed (Lifecycle Event)
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.958 186844 DEBUG oslo_concurrency.lockutils [req-5f426164-95dd-4a46-93f8-51bf63c12f6f req-a6dba021-88e5-4bc3-b229-2a3989f74200 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:22:54 compute-0 podman[221355]: 2026-02-27 17:22:54.969465989 +0000 UTC m=+0.070724570 container create 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223)
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.970 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.971 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.972 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.972 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.972 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:54 compute-0 nova_compute[186840]: 2026-02-27 17:22:54.973 186844 DEBUG nova.virt.libvirt.driver [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 27 17:22:55 compute-0 systemd[1]: Started libpod-conmon-348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822.scope.
Feb 27 17:22:55 compute-0 podman[221355]: 2026-02-27 17:22:54.932371522 +0000 UTC m=+0.033630133 image pull 2eca8c653984dc6e576f18f42e399ad6cc5a719b2d43d3fafd50f21f399639f3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 27 17:22:55 compute-0 systemd[1]: Started libcrun container.
Feb 27 17:22:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8e61472a0a06d0774b97504605ce8971172f8f801215d8a1c1b2fc13b8e1cf27/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.067 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.076 186844 DEBUG nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 27 17:22:55 compute-0 podman[221355]: 2026-02-27 17:22:55.082854984 +0000 UTC m=+0.184113655 container init 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0)
Feb 27 17:22:55 compute-0 podman[221355]: 2026-02-27 17:22:55.091088258 +0000 UTC m=+0.192346859 container start 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:22:55 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [NOTICE]   (221374) : New worker (221376) forked
Feb 27 17:22:55 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [NOTICE]   (221374) : Loading success.
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.128 186844 INFO nova.compute.manager [None req-c3bf8d26-c1ce-426d-85ff-de0aa59c281d - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.142 186844 INFO nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Took 10.43 seconds to spawn the instance on the hypervisor.
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.143 186844 DEBUG nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.229 186844 INFO nova.compute.manager [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Took 11.62 seconds to build instance.
Feb 27 17:22:55 compute-0 nova_compute[186840]: 2026-02-27 17:22:55.261 186844 DEBUG oslo_concurrency.lockutils [None req-84248c39-c538-44c1-a0c6-178049975f84 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:56 compute-0 nova_compute[186840]: 2026-02-27 17:22:56.193 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.019 186844 DEBUG nova.compute.manager [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.020 186844 DEBUG oslo_concurrency.lockutils [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.020 186844 DEBUG oslo_concurrency.lockutils [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.021 186844 DEBUG oslo_concurrency.lockutils [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.022 186844 DEBUG nova.compute.manager [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] No waiting events found dispatching network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.022 186844 WARNING nova.compute.manager [req-7d55fb51-7f71-41b6-a2d1-7c33767cf041 req-23ac3f1f-08fc-41f1-a316-4645acadcc95 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received unexpected event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab for instance with vm_state active and task_state None.
Feb 27 17:22:57 compute-0 nova_compute[186840]: 2026-02-27 17:22:57.815 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:58 compute-0 NetworkManager[56537]: <info>  [1772212978.5328] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Feb 27 17:22:58 compute-0 NetworkManager[56537]: <info>  [1772212978.5335] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Feb 27 17:22:58 compute-0 nova_compute[186840]: 2026-02-27 17:22:58.534 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:58 compute-0 nova_compute[186840]: 2026-02-27 17:22:58.539 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:58 compute-0 ovn_controller[96756]: 2026-02-27T17:22:58Z|00166|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:22:58 compute-0 nova_compute[186840]: 2026-02-27 17:22:58.545 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:22:59 compute-0 nova_compute[186840]: 2026-02-27 17:22:59.156 186844 DEBUG nova.compute.manager [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:22:59 compute-0 nova_compute[186840]: 2026-02-27 17:22:59.158 186844 DEBUG nova.compute.manager [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing instance network info cache due to event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:22:59 compute-0 nova_compute[186840]: 2026-02-27 17:22:59.159 186844 DEBUG oslo_concurrency.lockutils [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:22:59 compute-0 nova_compute[186840]: 2026-02-27 17:22:59.159 186844 DEBUG oslo_concurrency.lockutils [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:22:59 compute-0 nova_compute[186840]: 2026-02-27 17:22:59.160 186844 DEBUG nova.network.neutron [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:23:00 compute-0 podman[221387]: 2026-02-27 17:23:00.673751971 +0000 UTC m=+0.072714470 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 27 17:23:01 compute-0 nova_compute[186840]: 2026-02-27 17:23:01.196 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:01 compute-0 nova_compute[186840]: 2026-02-27 17:23:01.994 186844 DEBUG nova.network.neutron [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updated VIF entry in instance network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:23:01 compute-0 nova_compute[186840]: 2026-02-27 17:23:01.997 186844 DEBUG nova.network.neutron [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updating instance_info_cache with network_info: [{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:23:02 compute-0 nova_compute[186840]: 2026-02-27 17:23:02.088 186844 DEBUG oslo_concurrency.lockutils [req-ba53b6f5-fa4e-4a53-8175-6702964564f2 req-38519cda-63c6-4ef2-a19b-b374f81119db 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:23:02 compute-0 nova_compute[186840]: 2026-02-27 17:23:02.819 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:05 compute-0 podman[221424]: 2026-02-27 17:23:05.671025755 +0000 UTC m=+0.072953455 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:23:06 compute-0 nova_compute[186840]: 2026-02-27 17:23:06.244 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:07 compute-0 ovn_controller[96756]: 2026-02-27T17:23:07Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:f8:04 10.100.0.10
Feb 27 17:23:07 compute-0 ovn_controller[96756]: 2026-02-27T17:23:07Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:f8:04 10.100.0.10
Feb 27 17:23:07 compute-0 nova_compute[186840]: 2026-02-27 17:23:07.824 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:11 compute-0 nova_compute[186840]: 2026-02-27 17:23:11.278 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:12 compute-0 nova_compute[186840]: 2026-02-27 17:23:12.515 186844 INFO nova.compute.manager [None req-325ec65b-3d05-4ae0-ab0f-c24e90ae30ba 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Get console output
Feb 27 17:23:12 compute-0 nova_compute[186840]: 2026-02-27 17:23:12.524 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:23:12 compute-0 nova_compute[186840]: 2026-02-27 17:23:12.827 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:13 compute-0 ovn_controller[96756]: 2026-02-27T17:23:13Z|00167|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.313 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:13 compute-0 ovn_controller[96756]: 2026-02-27T17:23:13Z|00168|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.339 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.736 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.737 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.738 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.738 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.824 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.905 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.906 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:23:13 compute-0 nova_compute[186840]: 2026-02-27 17:23:13.984 186844 DEBUG oslo_concurrency.processutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1/disk --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.175 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.176 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5578MB free_disk=73.16197204589844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.177 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.177 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.294 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Instance 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.294 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.295 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.566 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.612 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.676 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.676 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.499s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.681 186844 INFO nova.compute.manager [None req-c49d8620-05b3-4eb2-8050-b244cd99e511 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Get console output
Feb 27 17:23:14 compute-0 nova_compute[186840]: 2026-02-27 17:23:14.688 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:23:15 compute-0 ovn_controller[96756]: 2026-02-27T17:23:15Z|00169|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:23:15 compute-0 nova_compute[186840]: 2026-02-27 17:23:15.439 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:15 compute-0 NetworkManager[56537]: <info>  [1772212995.4400] manager: (patch-br-int-to-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Feb 27 17:23:15 compute-0 NetworkManager[56537]: <info>  [1772212995.4412] manager: (patch-provnet-89a1eb2a-26a7-477e-8851-880730dd9ee5-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Feb 27 17:23:15 compute-0 ovn_controller[96756]: 2026-02-27T17:23:15Z|00170|binding|INFO|Releasing lport 51084fc5-d92c-408a-a9bb-8379ff0f73a0 from this chassis (sb_readonly=0)
Feb 27 17:23:15 compute-0 nova_compute[186840]: 2026-02-27 17:23:15.453 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:15 compute-0 nova_compute[186840]: 2026-02-27 17:23:15.790 186844 INFO nova.compute.manager [None req-72802b33-434b-43c7-a870-7172146f5b6c 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Get console output
Feb 27 17:23:15 compute-0 nova_compute[186840]: 2026-02-27 17:23:15.799 215513 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.280 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.678 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.801 186844 DEBUG nova.compute.manager [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.802 186844 DEBUG nova.compute.manager [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing instance network info cache due to event network-changed-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.802 186844 DEBUG oslo_concurrency.lockutils [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.803 186844 DEBUG oslo_concurrency.lockutils [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquired lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.803 186844 DEBUG nova.network.neutron [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Refreshing network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.927 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.928 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.928 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.929 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.929 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.931 186844 INFO nova.compute.manager [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Terminating instance
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.932 186844 DEBUG nova.compute.manager [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 27 17:23:16 compute-0 kernel: tap1044f5b2-3e (unregistering): left promiscuous mode
Feb 27 17:23:16 compute-0 NetworkManager[56537]: <info>  [1772212996.9543] device (tap1044f5b2-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Feb 27 17:23:16 compute-0 ovn_controller[96756]: 2026-02-27T17:23:16Z|00171|binding|INFO|Releasing lport 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab from this chassis (sb_readonly=0)
Feb 27 17:23:16 compute-0 ovn_controller[96756]: 2026-02-27T17:23:16Z|00172|binding|INFO|Setting lport 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab down in Southbound
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.963 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:16 compute-0 ovn_controller[96756]: 2026-02-27T17:23:16Z|00173|binding|INFO|Removing iface tap1044f5b2-3e ovn-installed in OVS
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.967 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:16 compute-0 nova_compute[186840]: 2026-02-27 17:23:16.979 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:16 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Feb 27 17:23:16 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000000d.scope: Consumed 12.213s CPU time.
Feb 27 17:23:16 compute-0 systemd-machined[156136]: Machine qemu-13-instance-0000000d terminated.
Feb 27 17:23:16 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:16.993 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:f8:04 10.100.0.10'], port_security=['fa:16:3e:7b:f8:04 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-69a995b1-fab3-4631-a4c9-73f23854e64d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0922444e0aaf445884a7c2fa20793b1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a4e352ed-bf10-46cf-a75f-cabcd828d88c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed5c5633-2663-423a-8cd1-8fc77b585b41, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>], logical_port=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7efc24294ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:23:16 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:16.995 106085 INFO neutron.agent.ovn.metadata.agent [-] Port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab in datapath 69a995b1-fab3-4631-a4c9-73f23854e64d unbound from our chassis
Feb 27 17:23:16 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:16.997 106085 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 69a995b1-fab3-4631-a4c9-73f23854e64d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 27 17:23:16 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:16.998 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[c18f7bc6-1ee6-4555-ac5c-7b44ace621d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:16 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:16.999 106085 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d namespace which is not needed anymore
Feb 27 17:23:17 compute-0 podman[221462]: 2026-02-27 17:23:17.067314697 +0000 UTC m=+0.083497157 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:23:17 compute-0 podman[221459]: 2026-02-27 17:23:17.090139351 +0000 UTC m=+0.107563941 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:23:17 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [NOTICE]   (221374) : haproxy version is 2.8.14-c23fe91
Feb 27 17:23:17 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [NOTICE]   (221374) : path to executable is /usr/sbin/haproxy
Feb 27 17:23:17 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [ALERT]    (221374) : Current worker (221376) exited with code 143 (Terminated)
Feb 27 17:23:17 compute-0 neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d[221370]: [WARNING]  (221374) : All workers exited. Exiting... (0)
Feb 27 17:23:17 compute-0 systemd[1]: libpod-348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822.scope: Deactivated successfully.
Feb 27 17:23:17 compute-0 podman[221526]: 2026-02-27 17:23:17.177675967 +0000 UTC m=+0.063089522 container died 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.195 186844 INFO nova.virt.libvirt.driver [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Instance destroyed successfully.
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.195 186844 DEBUG nova.objects.instance [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lazy-loading 'resources' on Instance uuid 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 27 17:23:17 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822-userdata-shm.mount: Deactivated successfully.
Feb 27 17:23:17 compute-0 systemd[1]: var-lib-containers-storage-overlay-8e61472a0a06d0774b97504605ce8971172f8f801215d8a1c1b2fc13b8e1cf27-merged.mount: Deactivated successfully.
Feb 27 17:23:17 compute-0 podman[221526]: 2026-02-27 17:23:17.225962471 +0000 UTC m=+0.111376036 container cleanup 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 27 17:23:17 compute-0 systemd[1]: libpod-conmon-348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822.scope: Deactivated successfully.
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.281 186844 DEBUG nova.virt.libvirt.vif [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-27T17:22:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1111984263',display_name='tempest-TestNetworkBasicOps-server-1111984263',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1111984263',id=13,image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCW+Vf+oKAa64TcEAgR3HI/FwInYpuSVHJb+s+I6+u9Y/HVVI0fBCHHm9GR9LBKuAFGhZ5mcvJZZVxW2QQy1WRlFtIxpi9P82osGwr/hCljitYSHR4iscYB/oMQOoJ9ssg==',key_name='tempest-TestNetworkBasicOps-53899880',keypairs=<?>,launch_index=0,launched_at=2026-02-27T17:22:55Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0922444e0aaf445884a7c2fa20793b1f',ramdisk_id='',reservation_id='r-k7ja1d76',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b49463d5-90a4-4c27-9dac-a140f152eabc',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1859516505',owner_user_name='tempest-TestNetworkBasicOps-1859516505-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-27T17:22:55Z,user_data=None,user_id='427d6e526715473ebe8997007bbff5cd',uuid=8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.281 186844 DEBUG nova.network.os_vif_util [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converting VIF {"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.224", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.283 186844 DEBUG nova.network.os_vif_util [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.283 186844 DEBUG os_vif [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.285 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.286 186844 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1044f5b2-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.289 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.290 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.294 186844 INFO os_vif [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:f8:04,bridge_name='br-int',has_traffic_filtering=True,id=1044f5b2-3edd-4bb1-9c19-99ebe55c99ab,network=Network(69a995b1-fab3-4631-a4c9-73f23854e64d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1044f5b2-3e')
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.295 186844 INFO nova.virt.libvirt.driver [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Deleting instance files /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1_del
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.296 186844 INFO nova.virt.libvirt.driver [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Deletion of /var/lib/nova/instances/8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1_del complete
Feb 27 17:23:17 compute-0 podman[221573]: 2026-02-27 17:23:17.305700093 +0000 UTC m=+0.057362989 container remove 348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.310 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1d5c8b-6af4-4a7f-a36c-0d614f9192cb]: (4, ('Fri Feb 27 05:23:17 PM UTC 2026 Stopping container neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d (348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822)\n348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822\nFri Feb 27 05:23:17 PM UTC 2026 Deleting container neutron-haproxy-ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d (348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822)\n348c02d177b77d2c75970aff44b214c54795cdf73f8c17ed415cb9bdd5bb4822\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.312 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[7b874efb-c9b6-44bd-90d3-b1131c234f43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.313 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69a995b1-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:23:17 compute-0 kernel: tap69a995b1-f0: left promiscuous mode
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.315 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.325 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.329 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[0586148b-d168-4182-82f4-a7c41c25a28d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.347 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[05616e09-4011-40b7-83f6-a3b340695462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.348 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[aecc546c-7052-44ce-9401-26434452d1f6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.366 215632 DEBUG oslo.privsep.daemon [-] privsep: reply[22720c62-2b9c-48fb-a70c-aa6648a4294f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 384984, 'reachable_time': 40219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221588, 'error': None, 'target': 'ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.369 106512 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-69a995b1-fab3-4631-a4c9-73f23854e64d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 27 17:23:17 compute-0 systemd[1]: run-netns-ovnmeta\x2d69a995b1\x2dfab3\x2d4631\x2da4c9\x2d73f23854e64d.mount: Deactivated successfully.
Feb 27 17:23:17 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:17.369 106512 DEBUG oslo.privsep.daemon [-] privsep: reply[e092dfff-a4ae-4f3d-b6fd-dcb1fbc3b01f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.422 186844 INFO nova.compute.manager [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Took 0.49 seconds to destroy the instance on the hypervisor.
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.423 186844 DEBUG oslo.service.loopingcall [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.423 186844 DEBUG nova.compute.manager [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.424 186844 DEBUG nova.network.neutron [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.697 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.698 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.698 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.834 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.835 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:23:17 compute-0 nova_compute[186840]: 2026-02-27 17:23:17.835 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:18 compute-0 nova_compute[186840]: 2026-02-27 17:23:18.768 186844 DEBUG nova.network.neutron [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:23:18 compute-0 nova_compute[186840]: 2026-02-27 17:23:18.862 186844 INFO nova.compute.manager [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Took 1.44 seconds to deallocate network for instance.
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.032 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.033 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.039 186844 DEBUG nova.compute.manager [req-820688f5-0776-46d1-88db-aeba60cadbd4 req-147bc481-6972-4c65-a468-5b55864aa5e3 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-vif-deleted-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.057 186844 DEBUG nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.083 186844 DEBUG nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.083 186844 DEBUG nova.compute.provider_tree [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.089 186844 DEBUG nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-vif-unplugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.090 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.090 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.090 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.091 186844 DEBUG nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] No waiting events found dispatching network-vif-unplugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.091 186844 WARNING nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received unexpected event network-vif-unplugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab for instance with vm_state deleted and task_state None.
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.092 186844 DEBUG nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.092 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Acquiring lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.092 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.093 186844 DEBUG oslo_concurrency.lockutils [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.093 186844 DEBUG nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] No waiting events found dispatching network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.094 186844 WARNING nova.compute.manager [req-79c75833-718b-4a3c-8cfa-7b61d2a923f4 req-4812c5c3-9730-456d-b58f-0140e658395e 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Received unexpected event network-vif-plugged-1044f5b2-3edd-4bb1-9c19-99ebe55c99ab for instance with vm_state deleted and task_state None.
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.098 186844 DEBUG nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.129 186844 DEBUG nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.169 186844 DEBUG nova.compute.provider_tree [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.197 186844 DEBUG nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.237 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.204s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.280 186844 INFO nova.scheduler.client.report [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Deleted allocations for instance 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.365 186844 DEBUG oslo_concurrency.lockutils [None req-0d6142f3-26b4-462f-9806-070312ce96df 427d6e526715473ebe8997007bbff5cd 0922444e0aaf445884a7c2fa20793b1f - - default default] Lock "8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.438s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.534 186844 DEBUG nova.network.neutron [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updated VIF entry in instance network info cache for port 1044f5b2-3edd-4bb1-9c19-99ebe55c99ab. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.535 186844 DEBUG nova.network.neutron [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Updating instance_info_cache with network_info: [{"id": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "address": "fa:16:3e:7b:f8:04", "network": {"id": "69a995b1-fab3-4631-a4c9-73f23854e64d", "bridge": "br-int", "label": "tempest-network-smoke--405001117", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0922444e0aaf445884a7c2fa20793b1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1044f5b2-3e", "ovs_interfaceid": "1044f5b2-3edd-4bb1-9c19-99ebe55c99ab", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.674 186844 DEBUG oslo_concurrency.lockutils [req-27a2e4f2-b778-45d2-a96d-019f887acdc4 req-b3b7eb85-2c74-4624-856d-5d6583b3e075 4b2e5116287a4d7d9cdb38abaa4536d3 6b068d37c19947fea5c6240baf2f4d80 - - default default] Releasing lock "refresh_cache-8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 27 17:23:19 compute-0 nova_compute[186840]: 2026-02-27 17:23:19.832 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:21 compute-0 nova_compute[186840]: 2026-02-27 17:23:21.282 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:21 compute-0 nova_compute[186840]: 2026-02-27 17:23:21.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:22 compute-0 nova_compute[186840]: 2026-02-27 17:23:22.289 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:22 compute-0 podman[221589]: 2026-02-27 17:23:22.664683154 +0000 UTC m=+0.069980216 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, release=1770267347, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 27 17:23:22 compute-0 podman[221590]: 2026-02-27 17:23:22.68639354 +0000 UTC m=+0.092234257 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 27 17:23:22 compute-0 nova_compute[186840]: 2026-02-27 17:23:22.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:23 compute-0 nova_compute[186840]: 2026-02-27 17:23:23.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:24 compute-0 nova_compute[186840]: 2026-02-27 17:23:24.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:23:24 compute-0 nova_compute[186840]: 2026-02-27 17:23:24.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:23:25 compute-0 nova_compute[186840]: 2026-02-27 17:23:25.169 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:25 compute-0 nova_compute[186840]: 2026-02-27 17:23:25.203 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:26 compute-0 nova_compute[186840]: 2026-02-27 17:23:26.308 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:27 compute-0 nova_compute[186840]: 2026-02-27 17:23:27.292 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:31 compute-0 nova_compute[186840]: 2026-02-27 17:23:31.360 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:31 compute-0 podman[221635]: 2026-02-27 17:23:31.676193527 +0000 UTC m=+0.078909796 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 27 17:23:32 compute-0 nova_compute[186840]: 2026-02-27 17:23:32.192 186844 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1772212997.1906989, 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 27 17:23:32 compute-0 nova_compute[186840]: 2026-02-27 17:23:32.194 186844 INFO nova.compute.manager [-] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] VM Stopped (Lifecycle Event)
Feb 27 17:23:32 compute-0 nova_compute[186840]: 2026-02-27 17:23:32.252 186844 DEBUG nova.compute.manager [None req-f15bf6f5-63b8-4512-85b9-c0ba5e90a747 - - - - - -] [instance: 8ea79f5e-3bb9-4a9c-96a4-e9ed438921e1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 27 17:23:32 compute-0 nova_compute[186840]: 2026-02-27 17:23:32.294 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:36 compute-0 nova_compute[186840]: 2026-02-27 17:23:36.361 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:36 compute-0 podman[221655]: 2026-02-27 17:23:36.667890704 +0000 UTC m=+0.071152529 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 27 17:23:37 compute-0 nova_compute[186840]: 2026-02-27 17:23:37.297 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:41 compute-0 nova_compute[186840]: 2026-02-27 17:23:41.400 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:42 compute-0 nova_compute[186840]: 2026-02-27 17:23:42.300 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:46 compute-0 nova_compute[186840]: 2026-02-27 17:23:46.429 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:47.103 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:23:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:47.104 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:23:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:23:47.105 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:23:47 compute-0 nova_compute[186840]: 2026-02-27 17:23:47.303 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:47 compute-0 podman[221681]: 2026-02-27 17:23:47.678370247 +0000 UTC m=+0.068999609 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:23:47 compute-0 podman[221682]: 2026-02-27 17:23:47.681255167 +0000 UTC m=+0.064176743 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 27 17:23:51 compute-0 nova_compute[186840]: 2026-02-27 17:23:51.481 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:52 compute-0 nova_compute[186840]: 2026-02-27 17:23:52.305 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:53 compute-0 podman[221726]: 2026-02-27 17:23:53.66629575 +0000 UTC m=+0.069537174 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, version=9.7, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:23:53 compute-0 podman[221727]: 2026-02-27 17:23:53.682142962 +0000 UTC m=+0.083020730 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:23:55 compute-0 ovn_controller[96756]: 2026-02-27T17:23:55Z|00174|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 27 17:23:56 compute-0 nova_compute[186840]: 2026-02-27 17:23:56.522 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:23:57 compute-0 nova_compute[186840]: 2026-02-27 17:23:57.307 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:01 compute-0 nova_compute[186840]: 2026-02-27 17:24:01.524 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:02 compute-0 nova_compute[186840]: 2026-02-27 17:24:02.309 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:02 compute-0 podman[221775]: 2026-02-27 17:24:02.662340743 +0000 UTC m=+0.070178702 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, io.buildah.version=1.43.0)
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:24:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:24:06 compute-0 nova_compute[186840]: 2026-02-27 17:24:06.526 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:07 compute-0 nova_compute[186840]: 2026-02-27 17:24:07.312 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:07 compute-0 podman[221795]: 2026-02-27 17:24:07.662072465 +0000 UTC m=+0.065151051 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:24:11 compute-0 nova_compute[186840]: 2026-02-27 17:24:11.545 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:12 compute-0 nova_compute[186840]: 2026-02-27 17:24:12.315 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.760 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.760 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.761 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.761 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.991 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.994 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5771MB free_disk=73.19058227539062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.994 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:24:14 compute-0 nova_compute[186840]: 2026-02-27 17:24:14.995 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.114 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.115 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.139 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.159 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.194 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:24:15 compute-0 nova_compute[186840]: 2026-02-27 17:24:15.194 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.200s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:24:16 compute-0 nova_compute[186840]: 2026-02-27 17:24:16.196 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:16 compute-0 nova_compute[186840]: 2026-02-27 17:24:16.589 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.319 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.701 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.702 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.703 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.763 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:24:17 compute-0 nova_compute[186840]: 2026-02-27 17:24:17.763 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:18 compute-0 podman[221821]: 2026-02-27 17:24:18.691354642 +0000 UTC m=+0.082500146 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 27 17:24:18 compute-0 podman[221820]: 2026-02-27 17:24:18.691528567 +0000 UTC m=+0.087694731 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:24:20 compute-0 nova_compute[186840]: 2026-02-27 17:24:20.756 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:21 compute-0 nova_compute[186840]: 2026-02-27 17:24:21.592 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:21 compute-0 nova_compute[186840]: 2026-02-27 17:24:21.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:22 compute-0 nova_compute[186840]: 2026-02-27 17:24:22.321 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:24 compute-0 podman[221861]: 2026-02-27 17:24:24.670781451 +0000 UTC m=+0.075733947 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.7)
Feb 27 17:24:24 compute-0 nova_compute[186840]: 2026-02-27 17:24:24.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:24 compute-0 podman[221862]: 2026-02-27 17:24:24.710763248 +0000 UTC m=+0.110924990 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 27 17:24:26 compute-0 nova_compute[186840]: 2026-02-27 17:24:26.622 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:26 compute-0 nova_compute[186840]: 2026-02-27 17:24:26.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:24:26 compute-0 nova_compute[186840]: 2026-02-27 17:24:26.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:24:27 compute-0 nova_compute[186840]: 2026-02-27 17:24:27.324 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:28 compute-0 sshd[131044]: exited MaxStartups throttling after 00:05:35, 3 connections dropped
Feb 27 17:24:28 compute-0 sshd-session[221907]: Accepted publickey for zuul from 192.168.122.10 port 45786 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:24:28 compute-0 systemd-logind[803]: New session 26 of user zuul.
Feb 27 17:24:28 compute-0 systemd[1]: Started Session 26 of User zuul.
Feb 27 17:24:28 compute-0 sshd-session[221907]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:24:28 compute-0 sudo[221911]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 27 17:24:28 compute-0 sudo[221911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:24:31 compute-0 nova_compute[186840]: 2026-02-27 17:24:31.651 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:32 compute-0 nova_compute[186840]: 2026-02-27 17:24:32.327 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:33 compute-0 podman[222093]: 2026-02-27 17:24:33.711432279 +0000 UTC m=+0.107850014 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, 
container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 27 17:24:34 compute-0 ovs-vsctl[222142]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 27 17:24:35 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 221935 (sos)
Feb 27 17:24:35 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 27 17:24:35 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 27 17:24:35 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 27 17:24:36 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 27 17:24:36 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 27 17:24:36 compute-0 nova_compute[186840]: 2026-02-27 17:24:36.652 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:37 compute-0 crontab[222557]: (root) LIST (root)
Feb 27 17:24:37 compute-0 nova_compute[186840]: 2026-02-27 17:24:37.329 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:38 compute-0 kernel: /proc/cgroups lists only v1 controllers, use cgroup.controllers of root cgroup for v2 info
Feb 27 17:24:38 compute-0 podman[222631]: 2026-02-27 17:24:38.674900208 +0000 UTC m=+0.073756361 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:24:38 compute-0 systemd[1]: Starting Hostname Service...
Feb 27 17:24:38 compute-0 systemd[1]: Started Hostname Service.
Feb 27 17:24:41 compute-0 nova_compute[186840]: 2026-02-27 17:24:41.653 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:42 compute-0 nova_compute[186840]: 2026-02-27 17:24:42.332 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:46 compute-0 nova_compute[186840]: 2026-02-27 17:24:46.655 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:46 compute-0 ovs-appctl[223779]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 27 17:24:46 compute-0 ovs-appctl[223785]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 27 17:24:46 compute-0 ovs-appctl[223789]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Feb 27 17:24:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:24:47.104 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:24:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:24:47.105 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:24:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:24:47.105 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:24:47 compute-0 nova_compute[186840]: 2026-02-27 17:24:47.334 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:51 compute-0 podman[224679]: 2026-02-27 17:24:51.218952219 +0000 UTC m=+0.063010401 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:24:51 compute-0 podman[224676]: 2026-02-27 17:24:51.248158935 +0000 UTC m=+0.090468358 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:24:51 compute-0 nova_compute[186840]: 2026-02-27 17:24:51.657 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:52 compute-0 nova_compute[186840]: 2026-02-27 17:24:52.339 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:54 compute-0 podman[224935]: 2026-02-27 17:24:54.802070378 +0000 UTC m=+0.096413614 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1770267347, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal)
Feb 27 17:24:54 compute-0 podman[224971]: 2026-02-27 17:24:54.833895387 +0000 UTC m=+0.079811610 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:24:55 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 27 17:24:56 compute-0 systemd[1]: Starting Time & Date Service...
Feb 27 17:24:56 compute-0 systemd[1]: Started Time & Date Service.
Feb 27 17:24:56 compute-0 nova_compute[186840]: 2026-02-27 17:24:56.697 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:24:57 compute-0 nova_compute[186840]: 2026-02-27 17:24:57.346 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:01 compute-0 nova_compute[186840]: 2026-02-27 17:25:01.744 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:02 compute-0 nova_compute[186840]: 2026-02-27 17:25:02.351 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:04 compute-0 podman[225398]: 2026-02-27 17:25:04.687166095 +0000 UTC m=+0.086590899 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 27 17:25:06 compute-0 nova_compute[186840]: 2026-02-27 17:25:06.747 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:07 compute-0 nova_compute[186840]: 2026-02-27 17:25:07.384 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:09 compute-0 podman[225419]: 2026-02-27 17:25:09.662920666 +0000 UTC m=+0.065342326 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:25:11 compute-0 nova_compute[186840]: 2026-02-27 17:25:11.751 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:12 compute-0 nova_compute[186840]: 2026-02-27 17:25:12.386 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:14 compute-0 sudo[221911]: pam_unix(sudo:session): session closed for user root
Feb 27 17:25:14 compute-0 sshd-session[221910]: Received disconnect from 192.168.122.10 port 45786:11: disconnected by user
Feb 27 17:25:14 compute-0 sshd-session[221910]: Disconnected from user zuul 192.168.122.10 port 45786
Feb 27 17:25:14 compute-0 sshd-session[221907]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:25:14 compute-0 systemd-logind[803]: Session 26 logged out. Waiting for processes to exit.
Feb 27 17:25:14 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Feb 27 17:25:14 compute-0 systemd[1]: session-26.scope: Consumed 1min 15.585s CPU time, 632.6M memory peak, read 246.2M from disk, written 41.0M to disk.
Feb 27 17:25:14 compute-0 systemd-logind[803]: Removed session 26.
Feb 27 17:25:14 compute-0 sshd-session[225443]: Accepted publickey for zuul from 192.168.122.10 port 41892 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:25:14 compute-0 systemd-logind[803]: New session 27 of user zuul.
Feb 27 17:25:14 compute-0 systemd[1]: Started Session 27 of User zuul.
Feb 27 17:25:14 compute-0 sshd-session[225443]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:25:15 compute-0 sudo[225447]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2026-02-27-colazdv.tar.xz
Feb 27 17:25:15 compute-0 sudo[225447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:25:15 compute-0 sudo[225447]: pam_unix(sudo:session): session closed for user root
Feb 27 17:25:15 compute-0 sshd-session[225446]: Received disconnect from 192.168.122.10 port 41892:11: disconnected by user
Feb 27 17:25:15 compute-0 sshd-session[225446]: Disconnected from user zuul 192.168.122.10 port 41892
Feb 27 17:25:15 compute-0 sshd-session[225443]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:25:15 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Feb 27 17:25:15 compute-0 systemd-logind[803]: Session 27 logged out. Waiting for processes to exit.
Feb 27 17:25:15 compute-0 systemd-logind[803]: Removed session 27.
Feb 27 17:25:15 compute-0 sshd-session[225472]: Accepted publickey for zuul from 192.168.122.10 port 41902 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:25:15 compute-0 systemd-logind[803]: New session 28 of user zuul.
Feb 27 17:25:15 compute-0 systemd[1]: Started Session 28 of User zuul.
Feb 27 17:25:15 compute-0 sshd-session[225472]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:25:15 compute-0 sudo[225476]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Feb 27 17:25:15 compute-0 sudo[225476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:25:15 compute-0 sudo[225476]: pam_unix(sudo:session): session closed for user root
Feb 27 17:25:15 compute-0 sshd-session[225475]: Received disconnect from 192.168.122.10 port 41902:11: disconnected by user
Feb 27 17:25:15 compute-0 sshd-session[225475]: Disconnected from user zuul 192.168.122.10 port 41902
Feb 27 17:25:15 compute-0 sshd-session[225472]: pam_unix(sshd:session): session closed for user zuul
Feb 27 17:25:15 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Feb 27 17:25:15 compute-0 systemd-logind[803]: Session 28 logged out. Waiting for processes to exit.
Feb 27 17:25:15 compute-0 systemd-logind[803]: Removed session 28.
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.729 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.730 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.730 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.731 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.930 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.931 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5560MB free_disk=73.1898422241211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.931 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:25:15 compute-0 nova_compute[186840]: 2026-02-27 17:25:15.932 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.023 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.023 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.057 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.083 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.085 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.086 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:25:16 compute-0 nova_compute[186840]: 2026-02-27 17:25:16.755 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:17 compute-0 nova_compute[186840]: 2026-02-27 17:25:17.085 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:17 compute-0 nova_compute[186840]: 2026-02-27 17:25:17.086 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:17 compute-0 nova_compute[186840]: 2026-02-27 17:25:17.388 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:17 compute-0 nova_compute[186840]: 2026-02-27 17:25:17.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:19 compute-0 nova_compute[186840]: 2026-02-27 17:25:19.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:19 compute-0 nova_compute[186840]: 2026-02-27 17:25:19.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:25:19 compute-0 nova_compute[186840]: 2026-02-27 17:25:19.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:25:19 compute-0 nova_compute[186840]: 2026-02-27 17:25:19.736 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:25:21 compute-0 podman[225501]: 2026-02-27 17:25:21.683607462 +0000 UTC m=+0.080645014 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:25:21 compute-0 podman[225502]: 2026-02-27 17:25:21.683594972 +0000 UTC m=+0.080560563 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 27 17:25:21 compute-0 nova_compute[186840]: 2026-02-27 17:25:21.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:21 compute-0 nova_compute[186840]: 2026-02-27 17:25:21.804 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:22 compute-0 nova_compute[186840]: 2026-02-27 17:25:22.390 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:22 compute-0 nova_compute[186840]: 2026-02-27 17:25:22.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:23 compute-0 nova_compute[186840]: 2026-02-27 17:25:23.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:25 compute-0 podman[225545]: 2026-02-27 17:25:25.671826147 +0000 UTC m=+0.072668847 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-type=git, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 27 17:25:25 compute-0 podman[225546]: 2026-02-27 17:25:25.705361797 +0000 UTC m=+0.098827605 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:25:26 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 27 17:25:26 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 27 17:25:26 compute-0 nova_compute[186840]: 2026-02-27 17:25:26.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:26 compute-0 nova_compute[186840]: 2026-02-27 17:25:26.806 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:27 compute-0 nova_compute[186840]: 2026-02-27 17:25:27.391 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:27 compute-0 nova_compute[186840]: 2026-02-27 17:25:27.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:25:27 compute-0 nova_compute[186840]: 2026-02-27 17:25:27.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:25:31 compute-0 nova_compute[186840]: 2026-02-27 17:25:31.832 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:32 compute-0 nova_compute[186840]: 2026-02-27 17:25:32.394 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:35 compute-0 podman[225599]: 2026-02-27 17:25:35.661293871 +0000 UTC m=+0.062837176 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, container_name=ceilometer_agent_compute)
Feb 27 17:25:36 compute-0 nova_compute[186840]: 2026-02-27 17:25:36.835 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:37 compute-0 nova_compute[186840]: 2026-02-27 17:25:37.396 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:40 compute-0 podman[225620]: 2026-02-27 17:25:40.661622397 +0000 UTC m=+0.059649227 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:25:41 compute-0 nova_compute[186840]: 2026-02-27 17:25:41.866 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:42 compute-0 nova_compute[186840]: 2026-02-27 17:25:42.398 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:46 compute-0 nova_compute[186840]: 2026-02-27 17:25:46.893 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:25:47.105 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:25:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:25:47.106 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:25:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:25:47.106 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:25:47 compute-0 nova_compute[186840]: 2026-02-27 17:25:47.400 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:51 compute-0 nova_compute[186840]: 2026-02-27 17:25:51.936 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:52 compute-0 nova_compute[186840]: 2026-02-27 17:25:52.402 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:52 compute-0 podman[225645]: 2026-02-27 17:25:52.657223485 +0000 UTC m=+0.056601421 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260223)
Feb 27 17:25:52 compute-0 podman[225644]: 2026-02-27 17:25:52.691731005 +0000 UTC m=+0.092691310 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:25:56 compute-0 podman[225685]: 2026-02-27 17:25:56.679324504 +0000 UTC m=+0.069392199 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 27 17:25:56 compute-0 podman[225686]: 2026-02-27 17:25:56.742055227 +0000 UTC m=+0.129030935 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:25:56 compute-0 nova_compute[186840]: 2026-02-27 17:25:56.938 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:25:57 compute-0 nova_compute[186840]: 2026-02-27 17:25:57.405 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:01 compute-0 nova_compute[186840]: 2026-02-27 17:26:01.975 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:02 compute-0 nova_compute[186840]: 2026-02-27 17:26:02.406 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.270 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.271 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:26:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:26:06 compute-0 podman[225731]: 2026-02-27 17:26:06.700449627 +0000 UTC m=+0.106286798 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 27 17:26:06 compute-0 nova_compute[186840]: 2026-02-27 17:26:06.978 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:07 compute-0 nova_compute[186840]: 2026-02-27 17:26:07.409 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:11 compute-0 podman[225753]: 2026-02-27 17:26:11.669431193 +0000 UTC m=+0.072049125 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:26:12 compute-0 nova_compute[186840]: 2026-02-27 17:26:12.017 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:12 compute-0 nova_compute[186840]: 2026-02-27 17:26:12.410 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.741 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.742 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.742 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.743 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.946 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.948 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5700MB free_disk=73.1899185180664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.948 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:26:16 compute-0 nova_compute[186840]: 2026-02-27 17:26:16.948 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.018 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.019 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.024 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.061 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.084 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.086 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.086 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:26:17 compute-0 nova_compute[186840]: 2026-02-27 17:26:17.412 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:18 compute-0 nova_compute[186840]: 2026-02-27 17:26:18.085 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:18 compute-0 nova_compute[186840]: 2026-02-27 17:26:18.085 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:18 compute-0 nova_compute[186840]: 2026-02-27 17:26:18.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:19 compute-0 nova_compute[186840]: 2026-02-27 17:26:19.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:19 compute-0 nova_compute[186840]: 2026-02-27 17:26:19.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:26:19 compute-0 nova_compute[186840]: 2026-02-27 17:26:19.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:26:20 compute-0 nova_compute[186840]: 2026-02-27 17:26:20.033 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:26:22 compute-0 nova_compute[186840]: 2026-02-27 17:26:22.021 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:22 compute-0 nova_compute[186840]: 2026-02-27 17:26:22.414 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:22 compute-0 nova_compute[186840]: 2026-02-27 17:26:22.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:23 compute-0 podman[225777]: 2026-02-27 17:26:23.64901096 +0000 UTC m=+0.059806121 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:26:23 compute-0 podman[225778]: 2026-02-27 17:26:23.666177068 +0000 UTC m=+0.064189790 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223)
Feb 27 17:26:23 compute-0 nova_compute[186840]: 2026-02-27 17:26:23.695 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:26 compute-0 nova_compute[186840]: 2026-02-27 17:26:26.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:27 compute-0 nova_compute[186840]: 2026-02-27 17:26:27.025 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:27 compute-0 nova_compute[186840]: 2026-02-27 17:26:27.417 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:27 compute-0 podman[225821]: 2026-02-27 17:26:27.678080752 +0000 UTC m=+0.080507017 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 27 17:26:27 compute-0 podman[225822]: 2026-02-27 17:26:27.719000061 +0000 UTC m=+0.120417491 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223)
Feb 27 17:26:28 compute-0 nova_compute[186840]: 2026-02-27 17:26:28.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:26:28 compute-0 nova_compute[186840]: 2026-02-27 17:26:28.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:26:32 compute-0 nova_compute[186840]: 2026-02-27 17:26:32.062 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:32 compute-0 nova_compute[186840]: 2026-02-27 17:26:32.418 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:37 compute-0 nova_compute[186840]: 2026-02-27 17:26:37.064 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:37 compute-0 nova_compute[186840]: 2026-02-27 17:26:37.421 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:37 compute-0 podman[225868]: 2026-02-27 17:26:37.666426691 +0000 UTC m=+0.068971759 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 27 17:26:42 compute-0 nova_compute[186840]: 2026-02-27 17:26:42.091 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:42 compute-0 nova_compute[186840]: 2026-02-27 17:26:42.422 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:42 compute-0 podman[225888]: 2026-02-27 17:26:42.664955993 +0000 UTC m=+0.064609631 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 27 17:26:47 compute-0 nova_compute[186840]: 2026-02-27 17:26:47.094 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:26:47.106 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:26:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:26:47.107 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:26:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:26:47.107 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:26:47 compute-0 nova_compute[186840]: 2026-02-27 17:26:47.424 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:52 compute-0 nova_compute[186840]: 2026-02-27 17:26:52.094 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:52 compute-0 nova_compute[186840]: 2026-02-27 17:26:52.425 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:54 compute-0 podman[225912]: 2026-02-27 17:26:54.670983508 +0000 UTC m=+0.063214146 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:26:54 compute-0 podman[225913]: 2026-02-27 17:26:54.675376647 +0000 UTC m=+0.060450727 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 27 17:26:57 compute-0 nova_compute[186840]: 2026-02-27 17:26:57.099 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:57 compute-0 nova_compute[186840]: 2026-02-27 17:26:57.430 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:26:58 compute-0 podman[225955]: 2026-02-27 17:26:58.667841299 +0000 UTC m=+0.068946599 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Feb 27 17:26:58 compute-0 podman[225956]: 2026-02-27 17:26:58.70842555 +0000 UTC m=+0.104245508 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260223, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible)
Feb 27 17:27:02 compute-0 nova_compute[186840]: 2026-02-27 17:27:02.103 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:02 compute-0 nova_compute[186840]: 2026-02-27 17:27:02.432 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:07 compute-0 nova_compute[186840]: 2026-02-27 17:27:07.175 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:07 compute-0 nova_compute[186840]: 2026-02-27 17:27:07.435 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:08 compute-0 podman[226002]: 2026-02-27 17:27:08.662308527 +0000 UTC m=+0.066382315 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible, container_name=ceilometer_agent_compute)
Feb 27 17:27:12 compute-0 nova_compute[186840]: 2026-02-27 17:27:12.178 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:12 compute-0 nova_compute[186840]: 2026-02-27 17:27:12.436 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:13 compute-0 podman[226022]: 2026-02-27 17:27:13.656387019 +0000 UTC m=+0.057313669 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:27:17 compute-0 nova_compute[186840]: 2026-02-27 17:27:17.211 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:17 compute-0 nova_compute[186840]: 2026-02-27 17:27:17.437 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.701 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.747 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.749 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.749 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.750 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.995 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.997 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5723MB free_disk=73.1899185180664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.997 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:27:18 compute-0 nova_compute[186840]: 2026-02-27 17:27:18.998 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.245 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.246 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.361 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.390 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.392 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:27:19 compute-0 nova_compute[186840]: 2026-02-27 17:27:19.393 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:27:20 compute-0 nova_compute[186840]: 2026-02-27 17:27:20.392 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:20 compute-0 nova_compute[186840]: 2026-02-27 17:27:20.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:20 compute-0 nova_compute[186840]: 2026-02-27 17:27:20.701 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:27:20 compute-0 nova_compute[186840]: 2026-02-27 17:27:20.701 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:27:20 compute-0 nova_compute[186840]: 2026-02-27 17:27:20.727 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:27:22 compute-0 nova_compute[186840]: 2026-02-27 17:27:22.254 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:22 compute-0 nova_compute[186840]: 2026-02-27 17:27:22.439 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:22 compute-0 nova_compute[186840]: 2026-02-27 17:27:22.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:23 compute-0 nova_compute[186840]: 2026-02-27 17:27:23.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:25 compute-0 podman[226046]: 2026-02-27 17:27:25.663918162 +0000 UTC m=+0.066457127 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:27:25 compute-0 podman[226047]: 2026-02-27 17:27:25.703660472 +0000 UTC m=+0.098844403 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 27 17:27:26 compute-0 nova_compute[186840]: 2026-02-27 17:27:26.694 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:26 compute-0 nova_compute[186840]: 2026-02-27 17:27:26.721 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:27 compute-0 nova_compute[186840]: 2026-02-27 17:27:27.256 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:27 compute-0 nova_compute[186840]: 2026-02-27 17:27:27.442 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:27 compute-0 nova_compute[186840]: 2026-02-27 17:27:27.710 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:27 compute-0 nova_compute[186840]: 2026-02-27 17:27:27.711 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.721 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.722 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.723 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.723 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.724 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 27 17:27:28 compute-0 nova_compute[186840]: 2026-02-27 17:27:28.753 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 27 17:27:29 compute-0 podman[226089]: 2026-02-27 17:27:29.690701989 +0000 UTC m=+0.081386069 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=ubi9/ubi-minimal, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Feb 27 17:27:29 compute-0 podman[226090]: 2026-02-27 17:27:29.769373268 +0000 UTC m=+0.155703549 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, 
managed_by=edpm_ansible, io.buildah.version=1.43.0)
Feb 27 17:27:32 compute-0 nova_compute[186840]: 2026-02-27 17:27:32.266 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:32 compute-0 nova_compute[186840]: 2026-02-27 17:27:32.443 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:37 compute-0 nova_compute[186840]: 2026-02-27 17:27:37.289 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:37 compute-0 nova_compute[186840]: 2026-02-27 17:27:37.445 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:39 compute-0 podman[226135]: 2026-02-27 17:27:39.683554408 +0000 UTC m=+0.078623080 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 27 17:27:42 compute-0 nova_compute[186840]: 2026-02-27 17:27:42.325 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:42 compute-0 nova_compute[186840]: 2026-02-27 17:27:42.447 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:42 compute-0 nova_compute[186840]: 2026-02-27 17:27:42.554 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:27:44 compute-0 podman[226155]: 2026-02-27 17:27:44.69650432 +0000 UTC m=+0.099965742 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:27:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:27:47.107 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:27:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:27:47.107 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:27:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:27:47.108 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:27:47 compute-0 nova_compute[186840]: 2026-02-27 17:27:47.371 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:47 compute-0 nova_compute[186840]: 2026-02-27 17:27:47.449 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:52 compute-0 nova_compute[186840]: 2026-02-27 17:27:52.372 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:52 compute-0 nova_compute[186840]: 2026-02-27 17:27:52.450 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:56 compute-0 podman[226179]: 2026-02-27 17:27:56.6581515 +0000 UTC m=+0.064948769 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 27 17:27:56 compute-0 podman[226180]: 2026-02-27 17:27:56.695111501 +0000 UTC m=+0.096378102 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260223)
Feb 27 17:27:57 compute-0 nova_compute[186840]: 2026-02-27 17:27:57.375 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:27:57 compute-0 nova_compute[186840]: 2026-02-27 17:27:57.451 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:00 compute-0 podman[226222]: 2026-02-27 17:28:00.67503557 +0000 UTC m=+0.077748148 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 27 17:28:00 compute-0 podman[226223]: 2026-02-27 17:28:00.712623736 +0000 UTC m=+0.107671883 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:28:02 compute-0 nova_compute[186840]: 2026-02-27 17:28:02.379 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:02 compute-0 nova_compute[186840]: 2026-02-27 17:28:02.452 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:28:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:28:07 compute-0 nova_compute[186840]: 2026-02-27 17:28:07.437 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:07 compute-0 nova_compute[186840]: 2026-02-27 17:28:07.454 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:10 compute-0 podman[226267]: 2026-02-27 17:28:10.659613374 +0000 UTC m=+0.066479407 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute)
Feb 27 17:28:12 compute-0 nova_compute[186840]: 2026-02-27 17:28:12.439 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:12 compute-0 nova_compute[186840]: 2026-02-27 17:28:12.455 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:15 compute-0 podman[226288]: 2026-02-27 17:28:15.67965005 +0000 UTC m=+0.074515627 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.456 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.458 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.458 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.458 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.479 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:17 compute-0 nova_compute[186840]: 2026-02-27 17:28:17.480 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:18 compute-0 nova_compute[186840]: 2026-02-27 17:28:18.726 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:18 compute-0 nova_compute[186840]: 2026-02-27 17:28:18.727 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.741 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.741 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.742 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.742 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.941 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.943 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5747MB free_disk=73.1900520324707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.943 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:28:19 compute-0 nova_compute[186840]: 2026-02-27 17:28:19.944 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.181 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.182 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.200 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing inventories for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.266 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating ProviderTree inventory for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.266 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Updating inventory in ProviderTree for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.292 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing aggregate associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.334 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Refreshing trait associations for resource provider 2b4df47a-58ba-41db-b94b-eb594c2f9699, traits: HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE42,HW_CPU_X86_F16C,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_AMD_SVM,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_IDE,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SHA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AVX2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_BMI,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_CLMUL,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.384 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.404 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.407 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:28:20 compute-0 nova_compute[186840]: 2026-02-27 17:28:20.407 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.463s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.408 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.409 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.409 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.429 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.481 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.483 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.483 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.483 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.513 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:22 compute-0 nova_compute[186840]: 2026-02-27 17:28:22.514 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:24 compute-0 nova_compute[186840]: 2026-02-27 17:28:24.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:24 compute-0 nova_compute[186840]: 2026-02-27 17:28:24.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:27 compute-0 nova_compute[186840]: 2026-02-27 17:28:27.515 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:27 compute-0 podman[226313]: 2026-02-27 17:28:27.664331476 +0000 UTC m=+0.072092607 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:28:27 compute-0 podman[226314]: 2026-02-27 17:28:27.677049943 +0000 UTC m=+0.073172704 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 27 17:28:29 compute-0 nova_compute[186840]: 2026-02-27 17:28:29.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:29 compute-0 nova_compute[186840]: 2026-02-27 17:28:29.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:28:30 compute-0 nova_compute[186840]: 2026-02-27 17:28:30.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:28:31 compute-0 podman[226357]: 2026-02-27 17:28:31.674376004 +0000 UTC m=+0.079886231 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, name=ubi9/ubi-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Feb 27 17:28:31 compute-0 podman[226358]: 2026-02-27 17:28:31.694433004 +0000 UTC m=+0.101347286 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.43.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.516 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.518 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.518 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.519 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.554 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:32 compute-0 nova_compute[186840]: 2026-02-27 17:28:32.555 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.556 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.557 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.558 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.558 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.617 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:37 compute-0 nova_compute[186840]: 2026-02-27 17:28:37.618 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:41 compute-0 podman[226406]: 2026-02-27 17:28:41.685007177 +0000 UTC m=+0.088978467 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.619 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.621 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.621 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.621 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.655 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:42 compute-0 nova_compute[186840]: 2026-02-27 17:28:42.656 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:28:46 compute-0 podman[226428]: 2026-02-27 17:28:46.654446114 +0000 UTC m=+0.062094248 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:28:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:28:47.108 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:28:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:28:47.109 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:28:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:28:47.109 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:28:47 compute-0 nova_compute[186840]: 2026-02-27 17:28:47.657 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:49 compute-0 nova_compute[186840]: 2026-02-27 17:28:49.392 186844 DEBUG oslo_concurrency.processutils [None req-5b4c4a99-4e5f-46a1-be4a-370cfc030c84 f2368cc93e964454bb4a8463a4b797ac 2d107a7a633145b79e8da4348c7f2d65 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 27 17:28:49 compute-0 nova_compute[186840]: 2026-02-27 17:28:49.414 186844 DEBUG oslo_concurrency.processutils [None req-5b4c4a99-4e5f-46a1-be4a-370cfc030c84 f2368cc93e964454bb4a8463a4b797ac 2d107a7a633145b79e8da4348c7f2d65 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 27 17:28:52 compute-0 nova_compute[186840]: 2026-02-27 17:28:52.659 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:28:56 compute-0 nova_compute[186840]: 2026-02-27 17:28:56.184 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:28:56.185 106085 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'fa:fd:4b', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'ae:7c:62:22:96:a0'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 27 17:28:56 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:28:56.187 106085 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 27 17:28:57 compute-0 nova_compute[186840]: 2026-02-27 17:28:57.719 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:28:58 compute-0 podman[226456]: 2026-02-27 17:28:58.679972686 +0000 UTC m=+0.074959288 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:28:58 compute-0 podman[226455]: 2026-02-27 17:28:58.695443141 +0000 UTC m=+0.096224438 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:29:02 compute-0 podman[226498]: 2026-02-27 17:29:02.685436682 +0000 UTC m=+0.085903041 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347)
Feb 27 17:29:02 compute-0 podman[226499]: 2026-02-27 17:29:02.711289096 +0000 UTC m=+0.105907189 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 27 17:29:02 compute-0 nova_compute[186840]: 2026-02-27 17:29:02.761 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:02 compute-0 nova_compute[186840]: 2026-02-27 17:29:02.762 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:02 compute-0 nova_compute[186840]: 2026-02-27 17:29:02.762 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:29:02 compute-0 nova_compute[186840]: 2026-02-27 17:29:02.762 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:02 compute-0 nova_compute[186840]: 2026-02-27 17:29:02.763 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:04 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:29:04.191 106085 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=114486db-e8a8-4651-8c2f-bcfde6c6e156, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 27 17:29:07 compute-0 nova_compute[186840]: 2026-02-27 17:29:07.766 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:12 compute-0 podman[226547]: 2026-02-27 17:29:12.66783099 +0000 UTC m=+0.072824585 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.769 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.770 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.770 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.770 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.771 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:12 compute-0 nova_compute[186840]: 2026-02-27 17:29:12.772 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:17 compute-0 podman[226567]: 2026-02-27 17:29:17.671581202 +0000 UTC m=+0.073833021 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.773 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.775 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.775 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.775 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.777 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:17 compute-0 nova_compute[186840]: 2026-02-27 17:29:17.777 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.745 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.745 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.745 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.746 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.922 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.923 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5744MB free_disk=73.19007110595703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.924 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:29:20 compute-0 nova_compute[186840]: 2026-02-27 17:29:20.924 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.063 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.064 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.092 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.115 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.118 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:29:21 compute-0 nova_compute[186840]: 2026-02-27 17:29:21.118 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:29:22 compute-0 nova_compute[186840]: 2026-02-27 17:29:22.119 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:22 compute-0 nova_compute[186840]: 2026-02-27 17:29:22.778 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:22 compute-0 nova_compute[186840]: 2026-02-27 17:29:22.779 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:23 compute-0 nova_compute[186840]: 2026-02-27 17:29:23.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:23 compute-0 nova_compute[186840]: 2026-02-27 17:29:23.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:29:23 compute-0 nova_compute[186840]: 2026-02-27 17:29:23.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:29:23 compute-0 nova_compute[186840]: 2026-02-27 17:29:23.793 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:29:24 compute-0 nova_compute[186840]: 2026-02-27 17:29:24.790 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:26 compute-0 nova_compute[186840]: 2026-02-27 17:29:26.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.779 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.781 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.781 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.782 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.783 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:27 compute-0 nova_compute[186840]: 2026-02-27 17:29:27.784 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:29 compute-0 podman[226591]: 2026-02-27 17:29:29.666828519 +0000 UTC m=+0.069924062 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:29:29 compute-0 podman[226592]: 2026-02-27 17:29:29.671222309 +0000 UTC m=+0.071857821 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 27 17:29:30 compute-0 nova_compute[186840]: 2026-02-27 17:29:30.696 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:31 compute-0 nova_compute[186840]: 2026-02-27 17:29:31.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:31 compute-0 nova_compute[186840]: 2026-02-27 17:29:31.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:29:32 compute-0 nova_compute[186840]: 2026-02-27 17:29:32.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:29:32 compute-0 nova_compute[186840]: 2026-02-27 17:29:32.782 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:32 compute-0 nova_compute[186840]: 2026-02-27 17:29:32.785 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:33 compute-0 podman[226631]: 2026-02-27 17:29:33.67376588 +0000 UTC m=+0.079127142 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.7, architecture=x86_64, build-date=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 27 17:29:33 compute-0 podman[226632]: 2026-02-27 17:29:33.713441468 +0000 UTC m=+0.114264257 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 27 17:29:37 compute-0 nova_compute[186840]: 2026-02-27 17:29:37.785 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:42 compute-0 nova_compute[186840]: 2026-02-27 17:29:42.786 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:42 compute-0 nova_compute[186840]: 2026-02-27 17:29:42.787 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:43 compute-0 podman[226679]: 2026-02-27 17:29:43.659271947 +0000 UTC m=+0.067255057 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:29:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:29:47.110 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:29:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:29:47.110 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:29:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:29:47.110 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:29:47 compute-0 nova_compute[186840]: 2026-02-27 17:29:47.788 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:48 compute-0 podman[226701]: 2026-02-27 17:29:48.694525863 +0000 UTC m=+0.098806845 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:29:52 compute-0 nova_compute[186840]: 2026-02-27 17:29:52.790 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.792 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.795 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.795 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.795 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.817 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:29:57 compute-0 nova_compute[186840]: 2026-02-27 17:29:57.818 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:30:00 compute-0 podman[226727]: 2026-02-27 17:30:00.67923174 +0000 UTC m=+0.078581224 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:30:00 compute-0 podman[226728]: 2026-02-27 17:30:00.694435844 +0000 UTC m=+0.088306170 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true)
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.818 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.822 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:30:02 compute-0 nova_compute[186840]: 2026-02-27 17:30:02.824 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:30:04 compute-0 podman[226769]: 2026-02-27 17:30:04.672655949 +0000 UTC m=+0.075320992 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 27 17:30:04 compute-0 podman[226770]: 2026-02-27 17:30:04.714546376 +0000 UTC m=+0.108653343 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:30:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:30:07 compute-0 nova_compute[186840]: 2026-02-27 17:30:07.821 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:07 compute-0 nova_compute[186840]: 2026-02-27 17:30:07.823 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:12 compute-0 nova_compute[186840]: 2026-02-27 17:30:12.823 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:12 compute-0 nova_compute[186840]: 2026-02-27 17:30:12.825 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:14 compute-0 podman[226814]: 2026-02-27 17:30:14.670833349 +0000 UTC m=+0.076144093 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.43.0)
Feb 27 17:30:17 compute-0 nova_compute[186840]: 2026-02-27 17:30:17.824 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:17 compute-0 nova_compute[186840]: 2026-02-27 17:30:17.826 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:19 compute-0 podman[226835]: 2026-02-27 17:30:19.66606147 +0000 UTC m=+0.068989331 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:30:21 compute-0 nova_compute[186840]: 2026-02-27 17:30:21.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:21 compute-0 nova_compute[186840]: 2026-02-27 17:30:21.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:21 compute-0 nova_compute[186840]: 2026-02-27 17:30:21.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.734 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.735 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.735 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.735 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.827 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.966 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.968 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5746MB free_disk=73.1900520324707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.968 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:30:22 compute-0 nova_compute[186840]: 2026-02-27 17:30:22.969 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.161 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.162 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.192 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.220 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.222 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:30:23 compute-0 nova_compute[186840]: 2026-02-27 17:30:23.223 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:30:25 compute-0 nova_compute[186840]: 2026-02-27 17:30:25.223 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:25 compute-0 nova_compute[186840]: 2026-02-27 17:30:25.223 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:30:25 compute-0 nova_compute[186840]: 2026-02-27 17:30:25.224 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:30:25 compute-0 nova_compute[186840]: 2026-02-27 17:30:25.248 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:30:26 compute-0 nova_compute[186840]: 2026-02-27 17:30:26.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:26 compute-0 nova_compute[186840]: 2026-02-27 17:30:26.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:27 compute-0 nova_compute[186840]: 2026-02-27 17:30:27.827 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:31 compute-0 podman[226861]: 2026-02-27 17:30:31.676877177 +0000 UTC m=+0.069469325 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260223)
Feb 27 17:30:31 compute-0 nova_compute[186840]: 2026-02-27 17:30:31.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:31 compute-0 nova_compute[186840]: 2026-02-27 17:30:31.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:30:31 compute-0 podman[226860]: 2026-02-27 17:30:31.700415031 +0000 UTC m=+0.099777020 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:30:32 compute-0 nova_compute[186840]: 2026-02-27 17:30:32.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:30:32 compute-0 nova_compute[186840]: 2026-02-27 17:30:32.829 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:35 compute-0 podman[226901]: 2026-02-27 17:30:35.657850192 +0000 UTC m=+0.061989146 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., distribution-scope=public, release=1770267347, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 27 17:30:35 compute-0 podman[226902]: 2026-02-27 17:30:35.690934687 +0000 UTC m=+0.085314225 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 27 17:30:37 compute-0 nova_compute[186840]: 2026-02-27 17:30:37.831 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.833 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.834 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.834 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.834 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.834 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:30:42 compute-0 nova_compute[186840]: 2026-02-27 17:30:42.835 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:45 compute-0 podman[226949]: 2026-02-27 17:30:45.704471513 +0000 UTC m=+0.100666042 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 27 17:30:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:30:47.111 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:30:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:30:47.112 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:30:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:30:47.112 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:30:47 compute-0 nova_compute[186840]: 2026-02-27 17:30:47.835 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:50 compute-0 podman[226970]: 2026-02-27 17:30:50.684338698 +0000 UTC m=+0.082387221 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:30:52 compute-0 nova_compute[186840]: 2026-02-27 17:30:52.837 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:30:57 compute-0 nova_compute[186840]: 2026-02-27 17:30:57.839 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:02 compute-0 podman[226994]: 2026-02-27 17:31:02.688418802 +0000 UTC m=+0.088185406 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 27 17:31:02 compute-0 podman[226995]: 2026-02-27 17:31:02.727620162 +0000 UTC m=+0.119039096 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 27 17:31:02 compute-0 nova_compute[186840]: 2026-02-27 17:31:02.841 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:06 compute-0 podman[227039]: 2026-02-27 17:31:06.70558766 +0000 UTC m=+0.105948705 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, release=1770267347, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.7, architecture=x86_64, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 27 17:31:06 compute-0 podman[227040]: 2026-02-27 17:31:06.738017358 +0000 UTC m=+0.133701315 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:31:07 compute-0 nova_compute[186840]: 2026-02-27 17:31:07.842 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:12 compute-0 nova_compute[186840]: 2026-02-27 17:31:12.844 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:16 compute-0 podman[227085]: 2026-02-27 17:31:16.679701583 +0000 UTC m=+0.079997100 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.43.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0)
Feb 27 17:31:17 compute-0 nova_compute[186840]: 2026-02-27 17:31:17.848 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:21 compute-0 podman[227107]: 2026-02-27 17:31:21.669206781 +0000 UTC m=+0.066071659 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 27 17:31:21 compute-0 nova_compute[186840]: 2026-02-27 17:31:21.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:22 compute-0 nova_compute[186840]: 2026-02-27 17:31:22.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:22 compute-0 nova_compute[186840]: 2026-02-27 17:31:22.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:22 compute-0 nova_compute[186840]: 2026-02-27 17:31:22.850 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:22 compute-0 nova_compute[186840]: 2026-02-27 17:31:22.853 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.732 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.733 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.775 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.776 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.776 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.776 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.947 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.949 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5739MB free_disk=73.1900520324707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.949 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:31:24 compute-0 nova_compute[186840]: 2026-02-27 17:31:24.949 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.044 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.045 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.074 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.122 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.124 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:31:25 compute-0 nova_compute[186840]: 2026-02-27 17:31:25.125 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:31:27 compute-0 nova_compute[186840]: 2026-02-27 17:31:27.120 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:27 compute-0 nova_compute[186840]: 2026-02-27 17:31:27.701 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:27 compute-0 nova_compute[186840]: 2026-02-27 17:31:27.851 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:27 compute-0 nova_compute[186840]: 2026-02-27 17:31:27.855 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:31 compute-0 nova_compute[186840]: 2026-02-27 17:31:31.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:31 compute-0 nova_compute[186840]: 2026-02-27 17:31:31.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:31:32 compute-0 nova_compute[186840]: 2026-02-27 17:31:32.856 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:33 compute-0 podman[227132]: 2026-02-27 17:31:33.6655776 +0000 UTC m=+0.069763052 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:31:33 compute-0 podman[227133]: 2026-02-27 17:31:33.673148561 +0000 UTC m=+0.071395653 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.43.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260223, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent)
Feb 27 17:31:34 compute-0 nova_compute[186840]: 2026-02-27 17:31:34.695 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:34 compute-0 nova_compute[186840]: 2026-02-27 17:31:34.728 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:31:37 compute-0 podman[227173]: 2026-02-27 17:31:37.711481352 +0000 UTC m=+0.108871859 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 27 17:31:37 compute-0 podman[227174]: 2026-02-27 17:31:37.740299159 +0000 UTC m=+0.135331816 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, io.buildah.version=1.43.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 27 17:31:37 compute-0 nova_compute[186840]: 2026-02-27 17:31:37.857 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:42 compute-0 nova_compute[186840]: 2026-02-27 17:31:42.860 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:31:47.112 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:31:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:31:47.113 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:31:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:31:47.113 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:31:47 compute-0 podman[227221]: 2026-02-27 17:31:47.673042298 +0000 UTC m=+0.075036104 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 27 17:31:47 compute-0 nova_compute[186840]: 2026-02-27 17:31:47.862 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:31:52 compute-0 podman[227241]: 2026-02-27 17:31:52.670169289 +0000 UTC m=+0.067350481 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 27 17:31:52 compute-0 nova_compute[186840]: 2026-02-27 17:31:52.864 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:31:57 compute-0 nova_compute[186840]: 2026-02-27 17:31:57.865 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.867 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.868 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.868 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.869 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.869 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:32:02 compute-0 nova_compute[186840]: 2026-02-27 17:32:02.871 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:04 compute-0 podman[227267]: 2026-02-27 17:32:04.678485892 +0000 UTC m=+0.071049104 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb)
Feb 27 17:32:04 compute-0 podman[227266]: 2026-02-27 17:32:04.686302619 +0000 UTC m=+0.079535278 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.273 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.276 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.277 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:05 compute-0 ceilometer_agent_compute[196578]: 2026-02-27 17:32:05.279 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 27 17:32:07 compute-0 nova_compute[186840]: 2026-02-27 17:32:07.870 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:08 compute-0 podman[227308]: 2026-02-27 17:32:08.676790303 +0000 UTC m=+0.073732862 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 27 17:32:08 compute-0 podman[227309]: 2026-02-27 17:32:08.711400887 +0000 UTC m=+0.105073083 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, 
io.buildah.version=1.43.0, org.label-schema.license=GPLv2)
Feb 27 17:32:12 compute-0 nova_compute[186840]: 2026-02-27 17:32:12.872 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:17 compute-0 nova_compute[186840]: 2026-02-27 17:32:17.873 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:17 compute-0 nova_compute[186840]: 2026-02-27 17:32:17.875 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:18 compute-0 podman[227356]: 2026-02-27 17:32:18.699785488 +0000 UTC m=+0.097939823 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.43.0, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 27 17:32:21 compute-0 nova_compute[186840]: 2026-02-27 17:32:21.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:22 compute-0 nova_compute[186840]: 2026-02-27 17:32:22.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:22 compute-0 nova_compute[186840]: 2026-02-27 17:32:22.875 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:23 compute-0 podman[227376]: 2026-02-27 17:32:23.682542726 +0000 UTC m=+0.074146413 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 27 17:32:23 compute-0 nova_compute[186840]: 2026-02-27 17:32:23.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:25 compute-0 nova_compute[186840]: 2026-02-27 17:32:25.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:25 compute-0 nova_compute[186840]: 2026-02-27 17:32:25.699 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 27 17:32:25 compute-0 nova_compute[186840]: 2026-02-27 17:32:25.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 27 17:32:25 compute-0 nova_compute[186840]: 2026-02-27 17:32:25.800 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.698 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.737 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.738 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.738 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.738 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.942 186844 WARNING nova.virt.libvirt.driver [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.944 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5740MB free_disk=73.19047927856445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.945 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:32:26 compute-0 nova_compute[186840]: 2026-02-27 17:32:26.945 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.050 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.051 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.216 186844 DEBUG nova.compute.provider_tree [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed in ProviderTree for provider: 2b4df47a-58ba-41db-b94b-eb594c2f9699 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.238 186844 DEBUG nova.scheduler.client.report [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Inventory has not changed for provider 2b4df47a-58ba-41db-b94b-eb594c2f9699 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.241 186844 DEBUG nova.compute.resource_tracker [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.241 186844 DEBUG oslo_concurrency.lockutils [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:32:27 compute-0 nova_compute[186840]: 2026-02-27 17:32:27.877 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:32:28 compute-0 nova_compute[186840]: 2026-02-27 17:32:28.237 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:29 compute-0 nova_compute[186840]: 2026-02-27 17:32:29.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:31 compute-0 nova_compute[186840]: 2026-02-27 17:32:31.700 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:31 compute-0 nova_compute[186840]: 2026-02-27 17:32:31.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 27 17:32:32 compute-0 nova_compute[186840]: 2026-02-27 17:32:32.701 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:32 compute-0 nova_compute[186840]: 2026-02-27 17:32:32.702 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 27 17:32:32 compute-0 nova_compute[186840]: 2026-02-27 17:32:32.734 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 27 17:32:32 compute-0 nova_compute[186840]: 2026-02-27 17:32:32.879 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:35 compute-0 podman[227402]: 2026-02-27 17:32:35.670491226 +0000 UTC m=+0.065808011 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:32:35 compute-0 podman[227401]: 2026-02-27 17:32:35.700063782 +0000 UTC m=+0.100990039 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 27 17:32:35 compute-0 nova_compute[186840]: 2026-02-27 17:32:35.732 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:37 compute-0 nova_compute[186840]: 2026-02-27 17:32:37.881 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:32:39 compute-0 podman[227444]: 2026-02-27 17:32:39.672008749 +0000 UTC m=+0.077546539 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Feb 27 17:32:39 compute-0 podman[227445]: 2026-02-27 17:32:39.737301096 +0000 UTC m=+0.136762742 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.43.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:32:40 compute-0 nova_compute[186840]: 2026-02-27 17:32:40.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:40 compute-0 nova_compute[186840]: 2026-02-27 17:32:40.700 186844 DEBUG nova.compute.manager [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 27 17:32:40 compute-0 nova_compute[186840]: 2026-02-27 17:32:40.717 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:32:42 compute-0 nova_compute[186840]: 2026-02-27 17:32:42.882 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:32:47.114 106085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 27 17:32:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:32:47.115 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 27 17:32:47 compute-0 ovn_metadata_agent[106080]: 2026-02-27 17:32:47.115 106085 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 27 17:32:47 compute-0 nova_compute[186840]: 2026-02-27 17:32:47.884 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:49 compute-0 podman[227491]: 2026-02-27 17:32:49.669725858 +0000 UTC m=+0.072028829 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, io.buildah.version=1.43.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.886 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.888 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.888 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.889 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.916 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:32:52 compute-0 nova_compute[186840]: 2026-02-27 17:32:52.917 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:32:54 compute-0 podman[227511]: 2026-02-27 17:32:54.663051441 +0000 UTC m=+0.062043347 container health_status 284af8a7307110cdd095947150967c89974b33ea9099990b78691a8ac37227c7 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 27 17:32:57 compute-0 nova_compute[186840]: 2026-02-27 17:32:57.917 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.919 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.922 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.922 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.923 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.923 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:33:02 compute-0 nova_compute[186840]: 2026-02-27 17:33:02.926 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:06 compute-0 podman[227535]: 2026-02-27 17:33:06.66764197 +0000 UTC m=+0.072679435 container health_status 98dac693f00e7037eeffd9be83141a78b9418d6ce03f59d9d044bd8a1fd8a423 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/lib/openstack/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 27 17:33:06 compute-0 podman[227536]: 2026-02-27 17:33:06.709187149 +0000 UTC m=+0.103974065 container health_status adaaf9bd3218784d49d4d6b8db516f0f6b367e7c364afcf240992dd42c326366 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.43.0, maintainer=OpenStack Kubernetes Operator team)
Feb 27 17:33:07 compute-0 nova_compute[186840]: 2026-02-27 17:33:07.924 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:07 compute-0 nova_compute[186840]: 2026-02-27 17:33:07.927 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:10 compute-0 podman[227579]: 2026-02-27 17:33:10.673768619 +0000 UTC m=+0.079070527 container health_status 55d86fba1f345de13d4832cbcdda25de5bd54bd21018d102de6634ede06042b4 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 27 17:33:10 compute-0 podman[227580]: 2026-02-27 17:33:10.74473438 +0000 UTC m=+0.141351279 container health_status 958f45964245d79693a5b165773ae16e6380dc3747298f93d00974a9c6fb7580 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.43.0, 
managed_by=edpm_ansible, org.label-schema.build-date=20260223)
Feb 27 17:33:11 compute-0 sshd-session[227626]: Accepted publickey for zuul from 192.168.122.10 port 52726 ssh2: ECDSA SHA256:IBu+T0gHJoL/DxdPTdczCapnN2POqQCmPF/QqQnd5JQ
Feb 27 17:33:11 compute-0 systemd-logind[803]: New session 29 of user zuul.
Feb 27 17:33:11 compute-0 systemd[1]: Started Session 29 of User zuul.
Feb 27 17:33:11 compute-0 sshd-session[227626]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Feb 27 17:33:12 compute-0 sudo[227630]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Feb 27 17:33:12 compute-0 sudo[227630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 27 17:33:12 compute-0 nova_compute[186840]: 2026-02-27 17:33:12.927 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:15 compute-0 ovs-vsctl[227800]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 27 17:33:16 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 27 17:33:16 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 27 17:33:16 compute-0 virtqemud[186011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 27 17:33:17 compute-0 crontab[228202]: (root) LIST (root)
Feb 27 17:33:17 compute-0 nova_compute[186840]: 2026-02-27 17:33:17.928 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:19 compute-0 systemd[1]: Starting Hostname Service...
Feb 27 17:33:19 compute-0 systemd[1]: Started Hostname Service.
Feb 27 17:33:19 compute-0 podman[228317]: 2026-02-27 17:33:19.805193652 +0000 UTC m=+0.062085048 container health_status b35bc98ba51ce4045024ba64174e9ba82e4df5150ac25b012907d9fe337ca933 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=8419493e1fd846703d277695e03fc5eb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '5570d3e7026baadc267903a2328decf51de6bc7bcc77e9f91f885547415ae1b4-afec5b3af7f573e7b98fde7b7078a653e8d9f675c7db2295023a041f95c70f8d-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535-269487b8ce61d83ea4ecebd3a1867ac9ca611efee1213aefce612c11bb986535'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.43.0, managed_by=edpm_ansible)
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.729 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.930 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.932 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.932 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.932 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.955 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 27 17:33:22 compute-0 nova_compute[186840]: 2026-02-27 17:33:22.955 186844 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 27 17:33:23 compute-0 nova_compute[186840]: 2026-02-27 17:33:23.699 186844 DEBUG oslo_service.periodic_task [None req-53ed4689-f38d-4129-9553-85a8db358148 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
